Room 42 is where practitioners and academics meet to share knowledge about breaking research. In this episode, Pam Estes Brewer explains how practitioners can turn innovation in the field into publishable research that can drive change and support career growth.
Season 1, Episode 10 | 44 min
Transcript (Expand to View)
[00:00:13.420] - Liz Fraley
Good morning, everyone, and welcome to Room 42. I'm Liz Fraley from Single-Sourcing Solutions. I'm your moderator. This is Janice Summers, our interviewer. And welcome to Pam Estes Brewer from Mercer University.
[00:00:27.440] - Pam Estes Brewer
[00:00:28.000] - Liz Fraley
She's amazing. She's been on the TC Dojo and in Room 42 before. We're always glad to have her back. She's been at Mercer a really long time. You've been working with remote for a long time. You've been doing research in organizational process and maturity.
[00:00:49.250] - Liz Fraley
You've been working on remote teaming. You've been working in technical communications research. And you've been working with engineering schools, with experience labs, with homeland security. You're all over the place, and you've been working with journals. You edited some journals over the time?
[00:01:08.180] - Pam Estes Brewer
Special issues and on some boards. Yeah.
[00:01:10.250] - Liz Fraley
There we go. She's been everywhere, and we're super glad that you're here with us today.
[00:01:15.080] - Pam Estes Brewer
Thank you. It's good to be here. Yeah.
[00:01:17.750] - Janice Summers
Yeah, indeed. It is very nice for you. I think this is your first Room 42, right?
[00:01:23.930] - Pam Estes Brewer
[00:01:25.080] - Janice Summers
Yeah. This is your first official. Welcome.
[00:01:28.370] - Pam Estes Brewer
One of my dojos turned into a Room 42.
[00:01:30.830] - Janice Summers
[00:01:35.510] - Liz Fraley
So today, we're talking about how practitioners can get published and ignite change. Take it away, Janice.
[00:01:42.730] - Janice Summers
Yes, which is really important. I think people want to know. And actually, we've gotten questions like, well, how do I go about doing this? Because we kind of want to elevate what we're doing, and we want to make change. But how do you even begin? And what's the proper methodology? What's the channel to get this done, Pam? Help. What do we do?
[00:02:10.220] - Pam Estes Brewer
Well, first, let me just acknowledge something. Practitioners aren't directly rewarded on the job for thinking publication the way academics are; for academics, it's part of the job description. But a lot, or maybe even the majority, of innovation comes out of practice. And we are losing a lot if we don't, as practitioners or as collaborators with practitioners, try to bring some of that innovation into the field, so that it can be used.
[00:02:51.980] - Pam Estes Brewer
And I know about the proprietary stuff, you know. No, that's not what we're talking about. Your company wants to protect its capital. But when we're talking about some of our practices, there's a lot that can be learned. And I just want to kind of lay the groundwork here that I am a research geek. I totally get off on research, because it feels powerful to be able to look at something that needs change, or look at something that's just flat out interesting.
[00:03:24.920] - Pam Estes Brewer
And to be able to study that in a way that can provide credible results that you can then use in any way you want to. Think in terms of being a change agent in your organization. Think in terms of publication. Think in terms of leadership. Think in terms of your recognition and reputation as a professional and as an expert. Publishing your research can accomplish all of those things for you.
[00:03:56.060] - Janice Summers
Well, and let's face it. You start out with a thought or a theory, a supposition. Research really helps it come into full dimension, and validates what you think or what you might suppose. So getting that extra weight, and then the discipline of actually going through and publishing that research, I think adds even more validity to your supposition to help you bolster igniting that change, right?
[00:04:29.480] - Pam Estes Brewer
[00:04:30.560] - Janice Summers
It's all to kind of create that positive momentum that you're desiring to create. All of this gives it more weight, right?
[00:04:39.980] - Pam Estes Brewer
Absolutely. Absolutely. Peter Senge used the term abstraction wars. You can sit and debate something with your colleagues all day long, and throw out your opinions. But until you have evidence, it's nothing more than an abstraction war. You can think politics today: abstraction wars. It's not until you know how to study it and present reliable, valid data that you've got a real case to make.
[00:05:23.630] - Janice Summers
Right. And I think that's like a really important piece that's often overlooked. There's a lot of rigor in getting and collecting that data. But until you have that data, it's really just an opinion. Does that make sense? To me, it's just an opinion.
[00:05:44.752] - Pam Estes Brewer
No, it is. It's theory.
[00:05:45.829] - Janice Summers
[00:05:46.890] - Pam Estes Brewer
It's a claim. A claim without evidence.
[00:05:49.260] - Liz Fraley
[00:05:49.890] - Janice Summers
Well, yeah, it doesn't have any teeth in it. And I think that's kind of a dangerous thing, to make claims without doing that rigor, right? So how would someone go about it? They have an idea. They have a concept. How would they go about getting teeth into their hypothesis?
[00:06:16.070] - Pam Estes Brewer
Well, the one word that I would throw out there first, and it's not the only word, is systematic. If you're going to present evidence for a claim, you want to present evidence that is replicable, which means if somebody else tried to produce that evidence, they would be able to do it as well. Reproducible, meaning that you've provided the evidence, so now other people can build on it. And it's data driven. You know, you've got evidence based on data.
[00:06:51.560] - Pam Estes Brewer
And to produce those three things, you have to be very systematic about it. So when someone asks me, okay, so how do you get started? Well, you look for a problem, a gap between what you have and what you would like to have. Maybe it's a process in your organization. You know it's not working well, and you want it to work better, but you just have opinions about that right now, and you need to sell it.
[00:07:19.970] - Pam Estes Brewer
Well, then, you've seen that gap, and then you're going to ask yourself, what methods could I use in order to produce data about that gap that would answer questions about how to solve that problem? And then you gather that data systematically, you analyze it systematically, and you report it systematically, whether in a publication within your organization or a publication in a journal or a magazine.
[00:07:58.060] - Janice Summers
And I think when you're looking for data, are you also looking for things that disprove? Both prove and disprove your thoughts or your theory about the process?
[00:08:13.440] - Pam Estes Brewer
No, I'm not looking for proof, and I'm not looking for disproof. I'm looking for data, and that data either disproves or proves. If I go into it looking for proof, I have biased my study from the beginning. I can have a claim about what my problem is, but I cannot make a valid claim about the solution until I have studied it in an unbiased way.
[00:08:46.710] - Pam Estes Brewer
So let's say you've got a process problem at your organization. I can use one at our university right now. Our research process is so cumbersome. By the time a researcher who wants to get funded by, let's say, the National Science Foundation takes an award through the approval process and sets up the accounting process, I couldn't even tell you how many signatures are required, and how much replication takes place. And we need to find a solution for that.
[00:09:29.080] - Pam Estes Brewer
And right now, we're in the middle of identifying the problem, finding alternatives for solving the problem, and then studying the data about our proposed solution. So we're actually getting ready to run a pilot to collect the data to tell us whether or not the solution we think is going to work is actually going to work.
[00:09:52.280] - Liz Fraley
So what kinds of things did you gather to find out about this? I'm sort of curious about the study itself now.
[00:10:01.970] - Pam Estes Brewer
I would give great credit to a colleague of mine who has identified a piece of software that we... We actually use a tiny little piece of it here already, but we're not using it correctly. And so she has put together a proposal for the pilot to take this piece of software and implement it throughout three of the colleges and schools within our university.
[00:10:29.330] - Pam Estes Brewer
We're actually going to run it. Well, we have to get approval. It goes to administration within the next couple of weeks, actually. So it's that kind of research. So that may or may not be interesting outside of our university. I think it will be. I think it's going to be interesting to see if this pilot reveals that we could use this, and greatly improve our costs, as well as the amount of funding we're bringing in.
[00:11:03.080] - Pam Estes Brewer
But I think this is going to be something interesting to other universities once we're finished, and then it might be publishable. Yeah, go ahead. I can think of another example.
[00:11:14.580] - Liz Fraley
Also, I was thinking more in terms of... So for me, sometimes, I need practical steps. I need to see an example, like showing my math, right? I need to see some examples so I can extrapolate from them. So it takes a lot of signatures and there's replication. So what kind of signatures? What kind of replication? How do you find something like that?
And what do you do? Do you go count? Do you go interview? Right. So what do you do when you have a thought that some general process is kind of scary or whatever, and you want to gather that data? What do you do?
[00:11:55.560] - Pam Estes Brewer
So if I use this specific example, and then I'll try to maybe do another one as well: we're going to run the pilot with a new grant process, in this case through our medical school, I think. And we are going to then compare it to the metrics that we already have on the process that we've been using. And so then we're going to have metrics about how the software, and actually the way we use the software, how the new way compares to the old way in terms of amount of time spent, problems reported, and deadlines that are accomplished early. Those will be some of our metrics.
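A comparison like the one Pam describes, pilot metrics against the metrics you already have, can be sketched in a few lines of code. The metric names and numbers below are invented for illustration; real values would come from the pilot itself:

```python
from statistics import mean

# Hypothetical metrics from the existing grant process vs. the pilot.
# All numbers are invented for illustration.
old_process = {"hours_spent": [40, 52, 47], "problems_reported": [5, 7, 6]}
new_process = {"hours_spent": [31, 35, 29], "problems_reported": [2, 3, 1]}

def compare(metric):
    """Return (old mean, new mean, percent change) for one metric."""
    before = mean(old_process[metric])
    after = mean(new_process[metric])
    return before, after, 100 * (after - before) / before

for metric in old_process:
    before, after, pct = compare(metric)
    print(f"{metric}: {before:.1f} -> {after:.1f} ({pct:+.1f}%)")
```

Even a small table like this turns "the new way feels faster" into a concrete, reportable difference.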
[00:12:43.290] - Liz Fraley
Metrics. I can't get away from that. Metrics and tracking.
[00:12:46.500] - Pam Estes Brewer
No, absolutely not. And you don't want to, you know.
[00:12:49.350] - Janice Summers
No, you don't want to.
[00:12:50.100] - Liz Fraley
No, you don't.
[00:12:52.890] - Janice Summers
I just kind of want to back up a little bit. Part of this is you've got a tool in your toolbox right now that's not really being utilized, because you haven't been fully trained on it. Or people haven't adopted the tool's full power, or reviewed the process. So therefore, the process has grown and gotten cumbersome. So part of this is also learning how to use your tools.
[00:13:29.080] - Pam Estes Brewer
Yeah, but like you'd expect in any large organization, there are a lot of reasons why that's not happening. Because, oh yeah, we missed that. There are a lot of reasons why it hasn't happened.
[00:13:43.090] - Janice Summers
Yeah. But I mean, that's not unusual for companies: problem, problem, problem. But what do you have in your toolbox already, right? And has that really been fully utilized? Take emotion out of it, because sometimes, some people get emotional about tools. But take emotion out of that, and look at the capabilities from just a raw perspective. Does it have the capabilities to help you? And maybe it's a training issue.
[00:14:12.490] - Pam Estes Brewer
And collect the data, because I can tell you that this issue is hugely emotional here at this university. And so we're taking a solution-focused and data-driven approach.
[00:14:28.250] - Liz Fraley
And is that helping with convincing people to be part of the pilot?
[00:14:32.780] - Pam Estes Brewer
[00:14:36.220] - Liz Fraley
We look forward to seeing how that turns out. So I want to go back to the other thing too. When you get started, you've got to come to some position and start gathering data. As a practitioner who is not... Practitioners are typically not rewarded for doing full-on research, because they've got day-to-day deliverables and metrics they have to meet on another scale, and that's typically not included.
[00:15:06.730] - Liz Fraley
And you know me, I have any number of crazy ideas. And I've come to you with many of them, trying to convince an academic: hey, I'm noticing something. You know, my first step was always, hey, I've noticed something. Do you think this is worth pursuing? Should practitioners be forming relationships with academics? Are there academics who are more inclined to spare a moment here and there and help practitioners out? Or is the right answer that we should not bother you?
[00:15:41.950] - Janice Summers
I think that's a good question.
[00:15:44.350] - Pam Estes Brewer
No. You should absolutely network. I mean, that's right. That's one of our oldest guidelines in terms of being in business. But remember that for educators, so academics, it's part of our job description that we need to be publishing. And so you will find a lot of academics who would be interested in such collaborations. You may not need that collaboration. Or what you may need is just a few hours of conversation here and there to help get yours rolling, so there could be different levels of collaboration.
[00:16:25.990] - Pam Estes Brewer
And certainly, there may be plenty of researchers out there who aren't interested. But I would think there are far more who are interested. And in tech comm, we're a very practical field. I think we lean more towards desiring collaboration. And I think the new folks coming out of PhD programs, who are going to be our new academics in tech comm, I hear them talk a lot about practitioner collaboration, so I think it's a healthy thing.
[00:17:04.450] - Liz Fraley
Is that a change you've seen over the last bunch of years or so? Is that new that they're coming out, they really want to partner?
[00:17:13.150] - Pam Estes Brewer
I can't answer that, but I can say that 20 years ago, I didn't hear much conversation about this at all. And it's been building, and of course I've been part of that ... to build those connections for a decade or more. But it's building. There's just a great deal of interest in cooperation. But you don't have to have that either.
[00:17:38.350] - Pam Estes Brewer
All you need is some basic knowledge to do research, whether as a researcher or simply as a leader in your organization. It's not that hard to do. You just need a few basic rules.
[00:17:52.240] - Janice Summers
So what are the few basic rules? How would you lay that out?
[00:17:57.990] - Pam Estes Brewer
I was going to tell y'all. I had held up a book before, and I haven't done this before in a Dojo or in a Room 42.
[00:18:07.660] - Janice Summers
This is important. I want everyone to pay attention.
[00:18:12.280] - Pam Estes Brewer
This is A Research Primer for Technical Communication. The second edition just came out, and I'm a coauthor. And it's not just for technical communicators. In fact, we've had people requesting that we publish this book without the emphasis on technical communication. It's a beginner's guide to doing basic research. And so I have great faith and pride in it, which is why I brought it up here.
[00:18:45.550] - Janice Summers
I think this is important. Can we put that in the chat for everybody? Okay. And they can find this at a bookstore, online?
[00:18:55.390] - Pam Estes Brewer
[00:18:56.200] - Liz Fraley
[00:18:57.040] - Pam Estes Brewer
So Hayhoe and Brewer
[00:18:59.620] - Janice Summers
Can you just tell me the title of the book?
[00:19:02.560] - Pam Estes Brewer
A Research Primer for Technical Communication. That was the title on the first edition, and we didn't change that.
[00:19:09.610] - Janice Summers
[00:19:11.380] - Pam Estes Brewer
And you can contact me if you've got questions when you want to use that book. If I've left something out, or I know if Dr. Hayhoe perceives we've left something out, we'd love to hear it anyway.
[00:19:23.650] - Liz Fraley
Well, and you've been an advocate... the book is very readable. You've been an advocate for making research writing accessible to practitioners. So I'm not surprised you're writing to help practitioners make more contributions from a research perspective.
[00:19:44.260] - Pam Estes Brewer
[00:19:47.440] - Liz Fraley
So anybody who wants to do this, Pam is accessible. You should do that.
[00:19:51.420] - Janice Summers
Well, I find a lot of people... we've been talking to a lot of academics and a lot of published researchers, and I find all of them are very accessible. I don't think anyone should ever be afraid to reach out and try and find where Pam is, and reach out and send you an email directly. I know as a tribe, they tend to talk about how shy they are, and it's true, introverted. But it's a group of introverts, so you can be a little extroverted or outgoing in a group of introverts.
[00:20:26.480] - Pam Estes Brewer
I'm not an introvert.
[00:20:28.310] - Janice Summers
No, but yeah... Yes, that's true. But there's a lot of people in the field. I mean, really, honestly, the field does attract a lot of people who tend to be very thoughtful and more introverted. And they think a lot, so they don't tend to go out a lot. Or they don't feel as comfortable reaching out or shouting out.
[00:20:54.860] - Liz Fraley
What other channels? So sometimes, networking can be hard for people who don't know how to approach someone or what to say. But I know, too, that you gave me another channel. And I'm sure you can come up with more. When I came to you with some crazy observation, you said, write it up and submit it to the... I'll read over it for you. But submit it to a journal, and see if anybody responds to you, if anybody else is also seeing this problem. It's a way of putting it in front of other people and inviting them to contact you, rather than you having to go out and call someone else.
[00:21:29.390] - Pam Estes Brewer
Yeah. So in the organizations that you belong to, among your colleagues within your organization, joining a new professional organization if you've got some you're interested in: those are great ways to vet those ideas that you've got, and see if you find someone else with that same interest.
[00:21:52.490] - Liz Fraley
Excellent. So journals seem like a good channel to academic-practitioner partnerships. Any other ideas, sort of random thoughts you have about ways for practitioners and academics to connect?
[00:22:10.110] - Pam Estes Brewer
Well, the Society for Technical Communication and the IEEE Professional Communication Society, those are both places. Write the Docs, some of the blogs in our field. Those are some good places. Tom Johnson's blog is a very popular one right now, but there are a lot of others. And those are all places, you know, blogs of interest, where you might get in touch with someone with similar interests. Or they might say to you, oh yeah, you should read X, and then that opens up a new avenue of names for you.
[00:22:47.830] - Pam Estes Brewer
And then I reiterate the idea: feel free to reach out to authors or bloggers and say, hey, I want to take this a little further. Can you suggest any connections for me? Just like what we're doing here.
[00:23:05.550] - Liz Fraley
Nice. I will put your email and your LinkedIn into the chat, so that anyone who wants to connect absolutely can. You've done this before. You've partnered with practitioners. You've done research in industry settings. Are there things that make it hard to build those relationships? What are the challenges of building those kinds of relationships or keeping them going? You know, I have my opinions, but no one's here for me.
[00:23:34.470] - Pam Estes Brewer
I think there are probably two barriers to entry for you doing research as a practitioner. I think one is just a complete unawareness of how to go about it or why to go about it: why you'd research, and how you'd get started, which we've already talked about a little bit. And then the other barrier, I think, is time. Everybody is short on time. This is something new to think about.
[00:24:05.420] - Pam Estes Brewer
So I would strongly encourage you not to put the pressure on yourself to just go out and find a research project right now, although maybe you have one in mind. But just to start talking about it, and that will help you to explore it with very little investment. And to begin to think about, well, how might I do it? And why might I do it?
[00:24:34.570] - Pam Estes Brewer
Yeah, yeah. I teach at Mercer University, and I teach in our graduate program, which is for professionals. And when we get to research, it's never hard for them to find projects within their organizations. We encourage them to do that, because then it's a win-win, right? They work on their education, and they return value to their organization.
[00:25:00.640] - Pam Estes Brewer
And so because of that, it's just never an issue for them to see a problem within their organization, or a challenge, or an opportunity that they could practice their first real, reliable research on. And it's really fun to watch that. I learn so much from the problems they approach.
[00:25:24.550] - Janice Summers
So what are some of the tactics or techniques that they use to try and get buy-in? Because I'm sure a lot of these are individual contributors, and they need to get buy-in and approval before they can take the next step. So say they come up with an idea or a thought or an observation. They've chatted with some people. They've connected with STC people, or they're connected with IEEE technical communicators. And everyone pretty much has heard their idea, and they think it's a pretty good idea.
[00:25:53.350] - Janice Summers
But now they have to go internal to the company and get approval to do this research. What are some of the tactics or techniques that somebody would use? What have you seen?
[00:26:08.840] - Pam Estes Brewer
Well, just as business per usual, you would go to whoever you directly report to first, and talk about what this might bring to the organization. And maybe that can be part of your goals for a particular quarter. That happens all the time among my students. But maybe they're not so interested in it, and it's something that you decide you want to pursue extracurricularly.
[00:26:46.580] - Pam Estes Brewer
There may also be legal implications to whatever research you're doing. And then the person you report to directly should direct you to your company's legal department for anything that might be of concern there. Is that kind of what you're meaning, Janice? What's your question?
[00:27:08.690] - Janice Summers
So a lot of these people are getting approval for these projects, because it's part of their annual goals for personal development.
[00:27:21.190] - Pam Estes Brewer
Or they make it that way.
[00:27:23.440] - Janice Summers
Or they make it strong enough that it is a good corporate goal, so that it benefits the corporation. And they're good at portraying what the outcome could mean for the organization as well. So it's a complex situation...
[00:27:39.850] - Liz Fraley
It's that time of year.
[00:27:40.940] - Janice Summers
In academia, it's publish or perish. And in corporate, it's not that way. So it's, I think, a little different dance or a little different perspective that they need to have. Because as an individual, you want to advance your career, and you do want to effect change. I think that in this field, a lot of people are very conscientious about their work and about the environment, about their company.
[00:28:07.360] - Janice Summers
So there is that personal goal, that I would like to change something in the company, but there's also, I want to look out for the betterment of the organization. And I think that's when you position things. That's the delicate thing that you need to do. That's different, right?
[00:28:25.040] - Pam Estes Brewer
Well, those things work together. I mean, they're not mutually exclusive. And I don't know if this helps. One of my students, she's an alumna now, works for a medical firm that provides the mechanisms and the tools for doing hip replacements. And of course, they have to have their user materials packaged with that machine, and they have to meet government guidelines: validation that the documentation works as well as the equipment does.
[00:29:07.740] - Pam Estes Brewer
So in this case, my student did a study to test not just whether or not that user documentation was valid, but whether she could suggest a process for validating user documentation every time. In other words, if I do it this way and it worked, can we then suggest that everybody within, I think I can use their name, Stryker, can use this and be assured that they validated their user documentation?
And so that was her research, and she designed it. She heavily used usability research to do that, and came out with an excellent product.
[00:30:02.320] - Liz Fraley
You know, that was not surprising. I'm just going to throw this in. I did an interview with a guy at Medtronic HQ who started his project similarly. But he started in '99, so it's a 20-year-old project over there. But they could prove that what came out of their documentation process is a Class III medical device. So there's a lot to that. It's really interesting, but he doesn't do the circuit anymore, because his product's 20 years old, and he's moved on to more interesting things.
[00:30:35.290] - Liz Fraley
Digging that stuff out is not easy.
[00:30:38.170] - Pam Estes Brewer
No. And I could tell you I dug it out. But I've got to prove to you I dug it out. I've got to have evidence to back up that claim.
[00:30:52.960] - Liz Fraley
All right, so I've got one question that's, you know, always in the back of my mind anyway. So somebody comes up with something. They get started. Practitioners are under time constraints. Typically, they're not going to do the same rigorous study, but they might do a certain level of study that's acceptable, and it can be built on by other researchers. Where do you see those boundaries? So I surveyed 20 of my friends, or I surveyed 27 people I know.
[00:31:26.020] - Liz Fraley
I surveyed 50 people in my professional organization. Where are those boundaries for practitioners? What makes a good starting point for other researchers? Where do researchers really begin? What does that scale look like? So as a practitioner, I can say, you know, I'm doing reasonable research for me at my level, because I've asked this many people.
[00:31:55.810] - Pam Estes Brewer
Well, there are two different ways to use your research, and that is descriptive or inferential. When I do descriptive research, I'm essentially seeking to describe something within a very small realm, a very small population. And the other type is inferential statistics, where I say, hey, I have constructed this research in such a way that we can infer that a much larger group would perform the same way as the smaller group. People always want to know, okay, so what number is that?
[00:32:44.760] - Janice Summers
[00:32:47.640] - Liz Fraley
Did I just ask that?
[00:32:50.730] - Pam Estes Brewer
But the truth is, there is no number for that. Among statisticians, you would get all kinds of debate, because that depends on, okay, how are you studying it? Who do you want it to apply to? Who did you study? And all of those things would play into how large and what kind of sample you need, between going descriptive and going inferential. And don't get me wrong, descriptive is not less important than inferential. Descriptive is really important, and it needs to be done well.
[00:33:31.620] - Pam Estes Brewer
So for example, I could make claims about whether my new website actually does improve the performance of users. I'm not trying to infer anything about everybody else. I just want to know if the users of my website are improving. And so I'm describing how people respond to my website.
If I want to make a solid claim, I need to come up with some metrics that are statistically significant. And that's really important research. And that's probably, I don't know, maybe that's the more important research.
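The distinction Pam draws here can be made concrete with a short sketch. The task-completion times below are hypothetical, and the permutation test is just one simple, assumption-light way to check statistical significance; it is an illustration, not the specific method used in any study mentioned in the episode:

```python
import random
from statistics import mean

# Hypothetical task-completion times (seconds) on the old vs. new website.
old_site = [62, 75, 58, 71, 69, 80, 66, 73]
new_site = [55, 49, 61, 52, 58, 47, 60, 54]

# Descriptive: simply characterize the groups we actually observed.
print(f"old mean: {mean(old_site):.1f}s, new mean: {mean(new_site):.1f}s")

# Inferential: a permutation test asks how often a difference this large
# would appear by chance if the site version made no difference at all.
def permutation_p_value(a, b, trials=10_000, seed=42):
    rng = random.Random(seed)
    observed = mean(a) - mean(b)
    pooled = a + b
    hits = 0
    for _ in range(trials):
        rng.shuffle(pooled)  # random regrouping under the null hypothesis
        diff = mean(pooled[:len(a)]) - mean(pooled[len(a):])
        if abs(diff) >= abs(observed):
            hits += 1
    return hits / trials

p = permutation_p_value(old_site, new_site)
print(f"p-value: {p:.4f}")  # a small p supports inferring a real difference
```

Descriptively, I can report the two means as they stand. To make the inferential claim, I check how often randomly regrouping the same numbers produces a gap that large; a small p-value is what lets the claim extend beyond the people actually observed.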
[00:34:12.750] - Liz Fraley
And is there something we should strive for in order to do good research that way, because we're not schooled in research?
[00:34:20.340] - Pam Estes Brewer
No, no. And you don't have to be. You don't have to be to get started. You can do a good job just by starting small. And again, this book that we've done on the basics is one way for you to do that. Some real common ways to get started are with usability tests, or with surveys, or with studying performance. But a lot of times that's done within usability. So those are some really easy, common, useful ways to get started in research, and practice sort of monitoring yourself.
[00:35:02.520] - Pam Estes Brewer
Okay, am I doing this right? Is it reliable? Is it valid? Yeah, you're shaking your head.
[00:35:09.580] - Liz Fraley
Well, am I observing myself? Is it valid for me? It's usually no. But that seems very accessible, like anyone can do that. You can observe what you're doing. You can keep records of what you're doing. You can track anything and everything. How many calls you got from various groups, and track which groups call you the most, right? Or which groups stop by your desk? Or which groups send you an email? And how many use the phone? And how many use the Slack channel?
[00:35:39.090] - Liz Fraley
You can track anything, track anything.
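Tallying which groups reach you, and through which channels, as Liz suggests, needs almost no machinery. The group and channel names below are hypothetical examples:

```python
from collections import Counter

# Hypothetical log of incoming requests; in practice you'd append a row
# each time someone calls, emails, Slacks, or stops by your desk.
requests = [
    ("engineering", "slack"), ("marketing", "email"), ("engineering", "phone"),
    ("engineering", "slack"), ("support", "desk"), ("marketing", "slack"),
]

by_group = Counter(group for group, _ in requests)      # who asks most often
by_channel = Counter(channel for _, channel in requests)  # how they reach you

print("busiest group:", by_group.most_common(1)[0][0])
print("requests per channel:", dict(by_channel))
```

Once the counts exist, "which group calls me the most" stops being an impression and becomes data you can report.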
[00:35:42.840] - Pam Estes Brewer
And a lot of times in universities, we want to know: is this technique working with students?
[00:35:49.860] - Liz Fraley
[00:35:50.580] - Pam Estes Brewer
People want to write about it. People have, like, a class, and it goes super well, and they want to write about it. Well, then we've got to ask ourselves, did you collect your data systematically? Did you analyze your data systematically? Because maybe they were still in the abstraction war phase. Hey, I liked that. Okay, why did you like it?
[00:36:10.440] - Pam Estes Brewer
Well, I liked it because... But now we've got to go a step further than that. You need to get the data systematically, and then analyze it and report it systematically. Yeah. And I was just going to say, you posted successfullyremote.com, and it is still not live. Yeah, I tend to get into too many things. Recruiting very soon.
[00:36:34.980] - Liz Fraley
I know how that goes. I never met a project I didn't want to be a part of. So instead, you can get her book about working remote, because that's been around since 2015. I'll put that in the chat window too. Awesome. Wow, that's a lot of great places to start. And it certainly doesn't feel as daunting to practitioners who aren't necessarily rewarded for it.
[00:36:58.850] - Liz Fraley
It can be, once they raise it to a certain level and show that it's about the organization's success and improvement and efficiency, or whatever it is. Whatever it is you're tracking against, so that you can improve the thing you're tracking against.
[00:37:17.660] - Pam Estes Brewer
So start small and start systematically.
[00:37:24.200] - Liz Fraley
Trying to convince your coworkers to observe the same things, and hope that they do. Awesome. See, I think we had maybe one question that came from the crowd. Well, so here's what we didn't quite get through in all this. When academics are pairing up with practitioners, are there hurdles to sustaining those relationships? Do practitioners expect results faster? Or is it all relationship building?
[00:38:04.290] - Liz Fraley
Are there... Anything? Any advice, ideas about sustaining long term relationships with academics?
[00:38:13.280] - Pam Estes Brewer
I think it would come down to good teaming. If you're going to collaborate, you need to have some pretty frank discussions about what your goals are, and do they complement one another? And what are your communication expectations and production expectations? Have those conversations, and then you can know whether that particular collaboration is worth a shot or not, whether you think it might work. I mean, I work very well with some folks, and then not so well with others, just because of the way we work.
[00:38:57.230] - Pam Estes Brewer
Right, personalities and style. That doesn't mean it's not okay to talk with several researchers. But you know yourself, and get to know who you're trying to partner with. And find out what works best, kind of removing the emotion, because we don't all work the same way. And sometimes it's better to find someone who's complementary, someone who's not just like you, somebody who can kind of go toe to toe and challenge you in a positive way, challenge your suppositions, and make sure that you're staying on track. Not a bad thing.
[00:39:37.290] - Liz Fraley
And there are no issues, really, with shopping it around, right? It's not as if it's just one idea that's going to make or break the entire world's worldview. You know, I mean, you can talk to multiple practitioners. You can talk to multiple academics. You can tell them what you're doing, right? There are no millions of dollars at stake. You know, even if they hear your idea, they might go off in a different direction.
[00:39:57.960] - Liz Fraley
Or even if they do the same thing you're doing, at least you get the research done, right? And you didn't have to do it as a practitioner, at least. I know that a lot of times people don't like to share their ideas: "I've got a super secret, I can't share that, because someone else will take it from me." But academics are well aware of issues like that, and nobody's going to do that to you.
[00:40:21.880] - Janice Summers
No. And don't you think that if multiple people are researching the same type of thing, you can pull all that research together as a culmination? Like, those worlds can collide. You can pull it together and say, okay, this one found it this way, that one found it this way, and here are the similarities in the findings.
[00:40:42.950] - Liz Fraley
That's how we grow.
[00:40:43.410] - Janice Summers
That in and of itself is valid research. It's compiling research that's already been done, and drawing conclusions based on valid research that's already been done, right?
[00:40:59.260] - Pam Estes Brewer
[00:41:00.850] - Janice Summers
[00:41:01.570] - Pam Estes Brewer
You review what other people do. And I might have a great idea that I just can't get to. Yeah, so somebody else picks it up. Oh, cool, they're going to take it somewhere that's going to help me. And then maybe I pick it up again, but at a different point. So I don't think there's much need to protect our ideas. I think we're much more likely to profit if we share.
[00:41:30.220] - Janice Summers
Well, you know, in an unrelated field, you hear this a lot with people who create music and art. They do a song, and the song has been done this way, and everybody knows it that way. And then somebody comes along and just changes it and shifts it, and you see things in a whole other light. So you might have an idea, but it's kind of like art. You put it out there. It's okay, because even if somebody runs with the idea, they're not going to run with it the same way.
[00:42:02.780] - Janice Summers
But you've got multiple people running toward investigating this hypothesis and applying research methods to it. So in the end, when you come to your conclusion, they'll come to their conclusion. And again, you can pull it together into something that you would not have if you had just tried to hold it so tight. Yes. And own it, right? I think that's the thing when you're talking about research: you let go of ownership, don't you?
[00:42:36.540] - Janice Summers
When you have a theory about something, you kind of let go of the ownership of it, because now you want to share it with other people and explore it. In order to explore it, you have to share it. Does that make sense?
[00:42:52.220] - Pam Estes Brewer
Yes, it absolutely does.
[00:42:57.640] - Liz Fraley
All right, awesome. And our time is up.
[00:43:00.860] - Janice Summers
[00:43:02.570] - Liz Fraley
I don't know how that happens. All right, awesome. Thank you, Pam, for sharing so much information and making research approachable and accessible to practitioners. I think we're all a little inspired to at least try something, if only observing ourselves, maybe writing up a little thing, and networking some ideas the next time we meet people to talk to.
[00:43:27.250] - Pam Estes Brewer
Well, it's been my pleasure. And if you've got any questions, let me know.
[00:43:32.710] - Liz Fraley
That's right. Connect with her on LinkedIn, send her an email, and get the book. Don't forget the book.
[00:43:38.590] - Janice Summers
I remember, start out... What was it again?
[00:43:40.690] - Liz Fraley
[00:43:42.700] - Pam Estes Brewer
Small and systematically.
[00:43:46.030] - Janice Summers
Systematic. There we go.
[00:43:47.610] - Liz Fraley
[00:43:49.830] - Pam Estes Brewer
[00:43:50.300] - Liz Fraley
Thank you, Pam. Thanks, everyone for coming, and we'll see you next time.
[00:43:53.640] - Janice Summers
[00:43:54.850] - Pam Estes Brewer
In this episode...
Pam Estes Brewer, Ph.D., Professor at Mercer University, is a technical communicator, educator, and management consultant. She has spent the past 15 years researching and publishing on remote teaming, including international teaming, and her book, International Virtual Teams: Engineering Global Success, was published in 2015. She teaches in Mercer University’s School of Engineering and directs the online MS in Technical Communication Management and the Mercer User Experience Lab, including its work with such organizations as the Department of Homeland Security. She serves as an Associate Editor for IEEE Transactions on Professional Communication and as a board member for the Wesley Foundation of Macon.
Research and publication in the field of technical communication largely take place within universities, where folks are trained in how to produce reliable and valid research. However, much of the innovation in the field takes place in industry, among practitioners who are working out on the front lines. These practitioners often have not had training in how to conduct reliable and valid research that can be published and/or instigate needed changes within their workplaces. In this Room 42, join Pam Estes Brewer to discuss how you might take your research ideas and implement research that is both reliable and valid—research that can be published, build change that is needed in your organizations, and support your career growth.
Hosts & Guests
Pam Estes Brewer
Her book: Hayhoe, G. F., & Brewer, P. E. (2020). A Research Primer for Technical Communication (2nd ed.). Routledge.
Professional groups for networking: