How to Understand and Provide Praise and Criticism at Work

The issues of praise and criticism in the workplace are especially important for employee morale– after all, it feels bad to be criticized and good to be praised. The effects of praise and criticism are cumulative, so each must be given carefully and in a targeted, effective fashion. Praising irrelevant or inconsequential attributes of a coworker’s work won’t be as effective as choosing the correct target. By the same token, we all know the indignation and hurt that come from feeling we have been criticized unjustly. Of course, we may not be so happy when we receive accurate criticism either. This article is my attempt to dig into the concepts of workplace criticism and praise, tease out the actual psychological phenomena behind them, and offer a constructive path forward toward higher quality communication.

First, let’s define criticism and praise. Both are after-the-fact assessments of priorities, effort invested, and outcome accomplished relative to prior expectations. Praise is an observation that the ordering of priorities, the effort invested, and the outcome accomplished were more successful than what was expected beforehand. Criticism is an identification that priorities were not what was expected, and that as a result the effort invested may have been insufficient or misplaced, leading to an outcome that fell short. Neutral observations that are neither exactly criticism nor praise tend to identify unexpected priority placement or effort investment that did not have an explicitly positive or negative outcome.

By this definition, the two concepts of criticism and praise are in fact the same concept popularly called “feedback” in corporate doublespeak. I don’t like the term feedback: it’s nonspecific, and it’s frequently a euphemism for criticism, a word people avoid because of its emotionally harmful connotations. The fact that the word “criticism” has become verboten is an indictment of the disastrous state of communication skills in corporate life. Discussions of workplace priorities should not spur anxiety within employees, yet they do. Awareness of employee discomfort with receiving criticism has spurred plenty of investigation into how criticism should be delivered, but many employees still struggle.

We should not fear criticism at work– criticism is merely a type of social signalling which indicates that our work priorities were inconsistent with what was expected by others. Adopt a detached mindset, and accept that if we never received praise or criticism because our priorities were always exactly in tune with everyone else’s, we would be closer to ants than humans. We should not fear praise, either!

An inability to accept praise or a rejection of praise at work is merely a fear of admission that individual priorities were not the same as what was expected. A fear of criticism is frequently mirrored by a fear of praise because both pertain to individual deviation from expectation and thus a violation of social conformity. It is human nature to be conformist, so we can forgive an inbred tendency to avoid ostracization from the group, but we must overcome this tendency if we want to be part of a maximally effective team or organization.

Effective teams and organizations have a shared frame of priorities, which means that identifying deviations from those priorities is important for keeping on the right track. In this sense, we actually need a certain minimum amount of conformity in order to accomplish our group’s goals. With that being said, I am of the opinion that too much conformity is typically far more harmful than too little— a team that is incapable of deviating from expectation is stagnant and inflexible.

So, how do we deliver criticism and praise in a way that has the most helpful impact on the person receiving it? The biggest unstated misconception that I regularly come across is that criticism and praise can be doled out without reference to the receiving person. I would like to rectify this misconception, perhaps controversially: the most effective criticism or praise is carefully calibrated to what the receiving person prioritized when performing the work.

Let’s unpack that statement. In order to get the biggest psychological impact in the desired direction (more efficacy and team cohesion), we have to understand and empathize with our coworker. We have to get into their head.

Why do you think they prioritized what they prioritized, and does this explain the outcome? What aspect of their work did they seem to have put the most effort into, and what part do they seem proud of? Do they seem anxious, ashamed or avoidant of certain prioritizations or aspects of their work? Why would they feel this way? It helps to have the coworker reiterate exactly what they think the expectations were for a given project.

Identifying insecurities regarding the work in question is a good starting point if the above questions are inscrutable. Frequently during discussions of their work, people will provide clues which indicate that they suspect their actual prioritizations are different from the expected prioritizations that may have been agreed upon at the start of a project. Suspicion of differing priorities does not mean that the person should be criticized! Frequently, departures from expectation are positive, and are indicative of individual initiative and creativity. Individual initiative and creativity have their time and place, however; certain projects may be too sensitive or intolerant of deviation for an individual’s flair to have a positive impact.

Once you’ve identified points where a coworker’s prioritization or effort invested deviated from the original vision of the team, you have identified a point for criticism or praise. Examine the outcome compassionately: did the coworker’s choice seem as though it would be fruitful at the time? If there was really no need or leeway to reprioritize, and the outcome was worse than what was expected, they have earned criticism because it was the incorrect time for their creativity. Was the unexpected investment of effort fruitful in a surprising way while still accomplishing the original desired outcome? Time for praise.

The trick is to keep your criticism and praise limited, detached, and extremely topical. Find the points of individual initiative that the coworker took while working. If your coworker prioritized the wrong thing which led to a bad outcome, detail the logical chain for them if they aren’t aware that there was a problem. Did recalculating the sales from November waste valuable time that could have been spent compiling those sales into charts? Say so clearly and gently, giving your coworker acknowledgement for creativity but not shying away from the problem: “Though you are right that it’s essential for our data to be correct, prioritizing recalculating the sales from November instead of compiling those sales into charts led to a duplication of previous work which contributed toward us missing our deadline.”

Praise should follow the same formula, provided that the outcome was acceptable.  “Choosing to prioritize recalculation of the sales data over compiling the data into charts allowed us to catch a number of mistakes that we would not have otherwise.” Keep both praise and criticism impersonal! The objective of evaluating your coworker’s work is not to quantify their worth as a human being or “human resource” but rather to identify where their individual decisions were compatible with the objective of the team. Accept their choices as compartmentalized pieces on a per-project basis, then look for trends later on if you’re inclined.

Tone and body language are critical to giving and receiving praise and criticism, too. Because of how uncomfortable people are discussing deviations from expected priorities, a defensive body posture and a clinical, prescriptive tone occur very frequently on both sides of the table when evaluation time comes around. Making a conscious effort to avoid these harbingers of poor communication is absolutely essential! People will detect defensive or vulnerable body language and tone and mirror it when they piece together that criticism is inbound.

Instead, opt for open body language. Signalling warmth and having a benign disposition helps to prevent the other person from clamming up into a defensive posture and allows for praise and criticism to be fully analyzed without emotion. Tone of voice is a bit harder to remember to regulate, but should be carefully considered as well.

Praise should be delivered with a positive and serious tone– adopting a nurturing or parental voice is the most common mistake here. Workplace praise is not the same type of communication as praising your dog for returning its toy or your child for a good report card; workplace praise is clear-sighted objective recognition of successful individual task reprioritization. Praise for a good outcome is not personal, and shouldn’t be confounded by a friendly office relationship.

Criticism should also be delivered with a (slightly less) positive and serious tone. Remember, the purpose here is not to tear the other person down, or talk down to them, but rather to show them that their priorities caused outcomes that were not consistent with the team’s original purpose. Criticism should be delivered at normal speaking volume, and abstracted far away from any frustration you may feel.

A frustrated tone from you will cause the other person to grow defensive, and the maximum positive impact of criticism will not be achieved. A tone of simpering or crestfallen disappointment when delivering criticism will not do: personal emotions or discomfort are not relevant to the discussion of expected priorities and outcomes. Emphasize hope for the future, and move the discussion toward steps for next time around.

I hope you guys enjoyed this piece; I know that I struggle quite a bit with giving and accepting praise, so this article was enlightening for me to think through. Follow me on Twitter @cryoshon and be sure to check out my Patreon page if you like the stuff I’m writing!

 


How to Disagree Intelligently At Work

One of the big differences I see between technical/scientific people and laymen is the communication style in which critiques or disagreements are offered. Disagreeing with other people in an effective and respectful way is an extremely difficult skill that takes considerable guts to practice.

For the most part, people find disagreeing with each other difficult and uncomfortable, and use watered-down and less effective language as a result. Some people have the opposite problem, where they are too willing to disagree with others tactlessly without really considering why they disagree in the first place. The former style of disagreement leads to miscommunication and unfixed problems, whereas the latter leads to bruised egos and frayed team morale.

There’s clearly a big incentive to disagree effectively. Needless to say, there are many possible ways of delivering the sentiment of disagreement correctly and incorrectly. Certain personalities and dispositions are biased toward certain disagreement styles. The scale of disagreement matters too, as a technical dispute may be easier to resolve than a philosophical spat. This article pertains to both of those disagreements, though I think the technical spats are generally easier to resolve quickly as it’s possible to conduct experiments and determine which technical option is better. I’ll characterize ineffective ways of disagreeing and offer a few smarter methods in this article. First, it makes sense to elaborate on exactly what disagreement is in a professional context.

What is disagreement? I will define disagreement as an inconsistent opinion between parties. If an opinion is consistent among all parties, there is consensus. In a professional context, disagreement is a communication modality that is  found within teams or pairs of individuals. Communication modalities are fluid, and are emergent from the interactions between individuals that make up a group. A group made of particularly cantankerous individuals will likely be in the modality of disagreement far more than a group of shy people. Do not take this as a suggestion to form teams of compliant people: disagreement is how bad ideas are destroyed before they cause real damage, and a team is empowered by strong ideas. Disagreement can be essential pruning when used properly.

So, our understanding of disagreement is that it’s a pattern of communication resulting from inconsistent opinions on a given issue. Much of our time spent in meetings is actually spent trying to jostle the group’s current communication modality from disagreement to consensus. We may even decide to form groups based on how much or how little the members are likely to agree or disagree internally, though an excess of either is likely to be harmful for the actual output of the group.

One of the functions of leaders in the workplace is to try to circumvent a state of disagreement via executive action– though a definitive ruling will typically allow for work to continue despite the disagreement, it rarely actually resolves the dispute at hand and is frequently akin to the ego-bruising too-direct style of disagreement in terms of damage caused to the team.

Instead of promoting coping strategies for leaders to use in an attempt to ease the pain of being overruled, I think it’s much more effective for leaders to ease disputes via consensus building rather than defaulting to the power of their authority. Part of moving the team from disagreement to consensus is accepting that opinions are malleable and subject to extreme change under the right conditions. In order for the leader and group members to move toward consensus, effective disagreement is critical.

Ineffective disagreement:

  • Uses personal attacks against others, even if they aren’t present
  • Prompts negative defensive reactions from others via indirect criticism or passive aggression
  • Appeals to office politics or the sanctity of individual fiefdoms
  • Denies or neglects unchangeable frameworks or obstacles
  • Asserts incompetence of other people or groups that will be relied on, even if it’s true
  • Denies attempts to refine points of disagreement
  • Dismisses disagreement as irrelevant without explaining why
  • Breaks group up via factional lines instead of individual opinion
  • Is delivered curtly, bluntly, and without real consideration of the facts
  • Does not rally facts and data to support statements
  • Is delivered agitatedly or emotionally
  • Assumes negative reaction to disagreement from others before it’s actually given
  • Fills in details of opposing arguments without having explicitly heard them
  • Is overly general or lacking specific articulate criticisms
  • Can be reduced to “my gut feeling doesn’t like this”
  • Paves over or ignores opposing viewpoint when convenient
  • Detracts from importance of the issue in disagreement rather than address the disagreement itself
  • Expects authority to effect a particular action regardless of discussion
  • Is delivered only because it is expected by superiors
  • Surrenders quickly due to discomfort associated with disagreeing
  • Plays devil’s advocate wantonly or without purpose

Effective disagreement:

  • Maintains an open mind and pliable opinion
  • Accepts that disagreements can be resolved via changing of opinion
  • Does not assume personal correctness of opinion
  • Does not shut down discussion before the group has agreed to stop
  • Is delivered after considering the merit of the opposing viewpoint relative to the facts and data at hand
  • Is delivered coolly with constant reference to established facts and data for each statement
  • Attempts to refine points of disagreement between parties first, then resolve disagreements second
  • Seeks to actually change opinion of disagreeing parties to reach a consensus that all parties will agree is the most effective path forward
  • Does not respond to emotionality or passionate arguments, preferring impartial consideration
  • Accepts disagreement as an essential and positive part of team functionality
  • Assumes good will and common goals of people with different opinion
  • Understands the personalities and thought processes of the people supporting the opposing opinion
  • Tenaciously argues for opinion, but accepts defeat when clearly outmaneuvered
  • Accepts that there is usually no moral content of disagreement in the professional context
  • Does not hold grudges over past disagreements or allow them to taint the discussion
  • Is scientifically detached from both the issue and the individuals at hand
  • Is blind to status and applied equally
  • Is delivered respectfully, directly, without personal attacks or passive aggression
  • Is delivered in neutral language, in a neutral tone

There’s quite a bit to keep track of here, so I’ll summarize the biggest points of each quickly. Ineffective disagreement is emotional, argumentative, judgmental, fact-free, loud, and political. Effective/intelligent disagreement is data-driven, neutral-toned, open-minded, inquisitive, and status-blind. Of course, getting yourself and your team to disagree in an effective way is easier said than done, as many people have been disagreeing ineffectively for a lifetime. The colloquial pattern of disagreement is easy to fall into, but has no place in a work environment because it’s an expression of emotion rather than an attempt to navigate a path forward.

Delving into resolving disagreements, I highly suggest that you understand your own opinion on the disagreed-on issue completely. Write your opinion down, and think systematically! Most of the time, our opinions are not nearly as clarified or explicit as we would suspect. Very frequently, clashing opinions are a result of unclarified thoughts that lie in between premise and conclusion. The human brain has a fantastic ability to sketch an idea’s outskirts, then trick itself into believing that the interior is filled with detail without actually investigating each wrinkle. Upon examination of the area in between the edges, we find that our idea isn’t really as developed as we had initially hoped.

Referencing data and forcing a step-by-step compilation of an opinion’s logic is one of the strongest tools for evaluating ideas, and is an essential tool for smart disagreement. If an opinion is fully developed and linked to supporting data, it is easier to positively assert that the opinion is correct and easier to address clashes with other opinions. If an opinion is fully thought out and linked to data, it will usually be more persuasive than an emotional opinion and allow for a faster resolution of disagreement.

In the laboratory, the way to resolve certain disagreements of fact was to conduct experiments. The results of the experiment would clarify which opinion was correct, and instantly catalyze a consensus. Of course, there was always the chance that the data from an experiment would raise new disagreements and questions, but this too was a welcome consequence, and moved the discussion forward.

Conducting experiments to resolve disagreements may not always be possible in a work setting, but sometimes a thought experiment or hypothetical experiment can be helpful in clarifying opinions. If a jammed disagreement isn’t loosening up as you talk through the logical steps and evidence for each opinion, try an experiment. I’ve discussed how to conduct an experiment in a work setting in my previous post.

I find that being more in tune with emotions and personal state of mind helps to disagree more intelligently. As out there as it may sound, a lot of team disagreements over otherwise trivial issues are born from outside stressors. If a person is stressed out or otherwise emotionally run down, their disagreement style will trend toward the “ineffective disagreement” list. Defensiveness, emotionality, and reactivity are far more likely to crop up. In this sense, ineffective disagreement can be a symptom of other problems in the work environment.

The companion post to this will probably be discussing how to agree effectively in the workplace– easier said than done, I think! I may also revisit this post at a later time with special attention to office politics and personal fiefdoms, which I have found to be particularly poisonous for team cohesiveness and effective disagreement.

If you liked this article, follow me on Twitter @cryoshon and subscribe to my email listing. If you want me to write about something in particular, tweet me and I’ll give it a whack!

I’ve also just launched a Patreon page, located at http://patreon.com/cryoshon, so be sure to support me if you like the stuff I write here!

 

How to Improve Work-Stuff, Scientifically!

One of my favorite things to do at work is to find ways of optimizing workflows, actions, or processes that I’m regularly doing. If you do something multiple times per day or week, it’s worth doing it as well as possible, right? In my experience, most tasks or workflows are created thoughtfully, but then executed relatively automatically and, over time, thoughtlessly. Sure, if you have a workflow that’s deeply detail oriented or requires a lot of conscious, brain-on-task time, you’re likely to be mentally active while you execute it, but actually thinking about the efficiency of the process itself may not be on your mind.

Sometimes I set aside time for process improvements, but usually I fit it into a block of time that I don’t have slated for anything else. Depending on what kind of work you do and what kind of improvements you’re seeking to make, making a change to your process may require a lot of paperwork. If so, it’s still worth at least investigating whether you can make a change, but the bar for which changes are worth pursuing will be higher, since it makes more sense to bundle a ton of small fixes together or overhaul the process entirely than to file paperwork for every minor tweak.

When optimizing work, take care not to disrupt old dogmas willy-nilly. I propose investigating your workflows scientifically, and determining which optimizations to make scientifically as well. This means that the technique for optimizing workflows I’ll be discussing in this article will be suitable for some kinds of work but not others, and my scientific way of identifying beneficial changes may not operate properly for every type of work.

How do you select a process for scientific optimization? The following points are a good guide to seeing whether or not your process can be improved scientifically:

  1. Measurable outcomes and rigorous metrics. In order to think about optimizations scientifically, we need to be able to quantify the pieces we’re talking about. A manufacturing process that produces 15 yellow cubes in 1 hour is an easy candidate for scientific optimization because changes to the process will alter the number, color, or time it takes to produce the cubes. A painting technique that is used to produce impressionistic portraits is not a good choice for optimization, though with some time invested into making qualitative rubrics it may be possible.
  2. Empowerment to experiment. Everyone has bosses, and not everyone’s boss is going to be keen on experimentation with company assets. Having supportive co-workers and bosses is essential to experimenting with process improvements. Bosses may be scared away from the scientific optimization process because it’s resource intensive. Others may be scared due to their own insecurity with scientific pursuits, which tend to be perceived as complicated. Aside from clearance to experiment generally, some processes at work may be open for reinterpretation, whereas others will be sacred and untouchable.
  3. Non-catastrophic failure. Experimenting with the workflow that props up an entire business is sometimes necessary, but should be avoided if it can’t be done safely. The last thing an employee should do is destroy an already-functioning process by attempting to improve it. For some workflows, safe experimentation isn’t possible because of the potential for massive fallout if things go wrong, and making a smaller model to play with typically isn’t possible either. I suggest you avoid playing around with systems that will have bad consequences if they fail or return null results.
  4. Controls and Variables. If you’re really going to be conducting a scientific evaluation of your workflows, you need to have the ability to create controls and variables for your investigation. This means that it must be possible to keep the majority of your process the same while changing small pieces individually. Additionally, you need to have data for the way that the process behaves under normal, non-experimental conditions. Most workflows have a paperwork component of some kind, so this is a great place to start looking for control data that you can compare your experimental data with after you’ve run your experiment.

Now you know how to evaluate a process for scientific optimization, so let’s dive right into the meat of how to actually run an experiment once you’ve picked a process to change.

  1. First, if you haven’t already, decide what your variables will be. Remember, you should only be investigating one state of one variable for each trial in the experiment. The variables you pick are up to you, but keep in mind that the items you pick as variables are the items which will end up being improved by beneficial changes to the process that you discover after the experiment is over.
  2. Next, decide your controls. The controls are the most important part of getting usable data from the experiment. I suggest having a negative control (the process as executed before the improvements proposed by the experiment). If you want to get fancy and your process permits it, I’d also add in a null control (a control designed to prevent the process from moving forward) and a positive control (a control designed to test your ability to detect positive results and gather data), but these aren’t strictly necessary.
  3. Once you have decided your controls and variables, it’s time to write up an experimental protocol. How will you be isolating your controls from your experimental group? How will you be setting up your controls and altering your variables? What will your output look like? How will you be measuring the results of the experiment? How will data be presented in raw form? This is the hardest step and also the most risky step, scientifically. Ensure that your protocol is as close to the normal, pre-experiment way of doing things as possible in order to minimize variability. An experiment is only as strong as its protocol!
  4. Run your protocol and gather data. Each run of the protocol counts as a trial in your experiment. Take care to follow your protocol to the letter, and record data about how the output of the process changes based off of the state of the variables. Don’t worry about analyzing data yet, just try to stick to the protocol and pay attention to your controls and variables. It’s best to minimize variability by running protocols at the same time of day.
  5. Repeat step 4 as many times as needed. Gather data until you feel as though you have enough trials to make a decision. If you want to be super scientific, do some statistics and determine the sample size you need in order to make a good decision, but for most workplace experiments, that level of rigor isn’t necessary.
  6. Analyze the data gathered in steps 4-5 (a short code sketch of this step follows the list). Which changes to which variables created the most beneficial changes to your originally stated metrics? Were there any consequences to optimization?
  7. Implement changes to your workflow. This should be quite easy, with data in hand. Be sure to argue for your changes using the data that you gathered scientifically, if necessary. If there’s no boss to convince, then enjoy the fruits of your labor immediately.
  8. Show off your good results! Be sure to keep a record of the way that your workflow was run beforehand, just in case. It also helps to maintain records of how your metrics were performing before your scientific optimizations, so that you can show off the positive differences you effected later on. If your results were negative, don’t sweat it– most experiments have negative results. More experimentation might be useful, but know when it’s time to throw in the towel. There isn’t necessarily room to improve every single process, especially if it’s already been through the wringer a few times over the years.
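To make step 6 concrete, here is a minimal sketch in Python of the kind of comparison you might run once the trials are done. Everything in it is hypothetical: the metric (units completed per hour), the trial values, and the crude decision rule are stand-ins for whatever your own process actually produces.

```python
# Minimal sketch: comparing a control workflow against one experimental variable.
# The numbers are hypothetical; substitute your own recorded metric for each trial.
from statistics import mean, stdev

control_trials = [15, 14, 16, 15, 15]    # process as run before any change
variable_trials = [17, 18, 16, 18, 17]   # same process with one change applied

def summarize(label, trials):
    """Print the mean and spread for one group of trials."""
    print(f"{label}: mean={mean(trials):.1f}, stdev={stdev(trials):.1f}, n={len(trials)}")

summarize("control ", control_trials)
summarize("variable", variable_trials)

# A crude decision rule: adopt the change only if the improvement is larger
# than the run-to-run noise of the control group.
improvement = mean(variable_trials) - mean(control_trials)
if improvement > stdev(control_trials):
    print(f"Improvement of {improvement:.1f} units/hour exceeds normal variation; worth adopting.")
else:
    print("Difference is within normal variation; gather more trials or drop the change.")
```

If you went the “super scientific” route in step 5, a two-sample t-test (for example, scipy.stats.ttest_ind) would replace the crude rule above, but the idea is the same: compare the experimental trials against the controls before declaring victory.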

Hopefully this guide was helpful to you; I know that I’ve more or less run this regimen on every workflow and process that I’ve touched throughout my professional life. The core concept is systematically tracking changes to variables. As long as you can keep track of what you’re changing, you can make a causal connection between your changes and the outcome.

If you liked this piece, follow me on Twitter @cryoshon and be sure to subscribe to the email list on the right!

Time Management Tips from the HIV Lab

Growing up, I hadn’t ever imagined that I’d be at high risk of HIV infection for years on end as a result of my chosen profession. I thought that HIV was mostly a problem of Africans, or homosexuals in the US– a problem that was steadfastly irrelevant and completely opaque to my white, straight, middle class American self.

When I was desperately scouring for jobs to apply to after graduating from college, my only thought about working in an HIV lab was that it might be a cool opportunity to help people with HIV. I liked the idea of “doing science”, and I liked the idea of “helping people”. I grew to understand that in the course of my work, HIV would be my problem too: the high level goal of my job was to create a vaccine for HIV, and the only way to get there was by slogging through experiments involving HIV+ blood, stool, cell, and tissue samples every day, for years.

When I was interviewing for the job, they told me informally that the rate of infection at this laboratory was 0.3% per year, meaning that if I worked there for three years, I’d have about a 1% chance of contracting HIV due to my own mistakes. I don’t know if that statistic is true or not (I suspect not), but I ended up working there for three years, and definitely had a few close calls due to carelessness– a problem addressed later in this piece. At the time of my application, I wasn’t even a little bit scared. It wouldn’t be until much later that the full meaning of what I was going through would be clear to me, and the caution would take over– far later than it should have, of course.

Formally, my title at the start was “research technician” (how demeaning this term grew to be!) at the Ragon Institute of MGH, MIT, and Harvard, an academic research laboratory group devoted to the formulation of a vaccine or cure for HIV, a virus that attacks the immune system and has proven to be increasingly problematic in the developing world.

At this early point, I hadn’t yet understood that HIV was a problem outside of poor and minority communities. Luckily, I joined the Ragon book club, and read a biography called A Song in the Night written by one of our research subjects regarding his fight against HIV. Reading his account of HIV divested me of my delusions, and made me think more about the white, straight, middle class HIV epidemic that was largely hushed up during the early stages of the disease’s spread.

My job at the Ragon Institute (or Ragon for short) was my first “real” job after college, and I experienced a huge amount of personal and professional growth during my time there. In the course of my time at the Ragon, I went from being a lowly “pair of hands” quasi-biorobot to being one of the most experienced technologists in the Institute,  responsible for leading, training, and advising my peers.

The biological sciences are extremely demanding in terms of attention to detail, and immunology is no exception. Each experiment must be designed properly, and executed with caution and precision. In order for experiments to have statistical relevance, they must be repeated many times with slightly different variables, leading to a high volume of work.

The work must be performed in standardized ways, making use of components which have been tested and standardized themselves. These factors quickly create workflows that are extremely time consuming, dangerous, and psychologically demanding, generating stress. A tiny mistake could ruin a week long experiment, wasting time and money. A larger mistake could give you HIV.

This piece will chronicle the distilled professional wisdom from my time at the Ragon Institute, with a special emphasis on time management.

Many of my nuggets of wisdom have been culled from times when I made mistakes, or witnessed others making mistakes, frequently as a result of rushing through an experiment in a stressed out fashion due to fear of reproach and political fallout. I also frequently consulted Extreme Productivity, which is a decent resource for jump-starting your own thinking about improving your work experience.

After a year of working at the Ragon, I realized that I needed a solution to the problem of making easily avoidable mistakes in order to save my sanity. The mistake that prompted this thought occurred when, during an experiment, I performed an action akin to adding dish soap directly into a fresh cup of coffee that you’re planning to drink. It was a mixup of order, and relatively simple to avoid. I figured that a solution for the harder-to-avoid mistakes would become evident if I managed to find a technique for the small ones. I wasn’t wrong– any problem that’s large is a problem that’s waiting to be split up into particle-sized steps which are easy to solve.

First off, I figured I’d decrease the speed at which I worked. This seemed like a pretty basic common sense way of reducing mistakes. Later, I’d reform this idea to fit it into my concepts of stress reduction and time management, but at this early phase, I didn’t quite execute it properly. I pledged to slow down, especially when doing “simple” tasks. I didn’t think about breaking down large tasks into smaller ones, or planning more effectively, or making accommodations for my reduced rate of perfectionist work.

As a result, when I slowed down, I’d quickly have a backlog of work, and trouble making my appointments and reservations to use certain instruments or people’s time. Sure, the work that I produced didn’t have quite as many mistakes– until I began to get stressed about the growing pile of work yet to be done as a result of my slowness. Then, the growing stress would take its toll, causing mistakes on the more complicated manipulations of my experiments.

The missed and late appointments and reservations were also a stressor, causing tension with the other people in line to use the various research apparatuses. Just slowing down without taking anything else into consideration definitely wouldn’t work. With some trial and error, I made a system for improving my work quality.

My system has three main parts (time management, stress management, and professional relationships) and one main variable: perfectionism. I’ll be focusing on time management in this post and will describe the other parts of the system separately in a different piece. Your time management strategy must be consciously calibrated for the job at hand in light of perfectionism, because the level of perfectionism that you choose to apply is going to have transformative impacts on the details of your time management, your levels of stress, and your professional interactions.

I’ll explain in more detail how perfectionism fits into each piece as we go, but the main theme is that perfectionism is a sliding scale which has both good and bad consequences. In the HIV lab, I occupied every shade of the perfectionism gradient at one time or another, often unwittingly. As a novice, I had no control over my own level of perfectionism or lack thereof, meaning that simple but inconsequential tasks (slapping labels onto vials) were performed slowly and perfectly, whereas deeply difficult and complex tasks (calibrating the cytometer’s laser voltages to prevent spectral overlap of excited-state fluorochromes) were breezed through without care. When I reached mastery, I understood how to regulate my own level of perfectionism to best complete the tasks at hand. I hope to share this ability with you.

The first step in revamping my time management ability was to estimate and then measure the amount of time that it took me to perform various common tasks. I measured how long it took me to prepare my samples for the analyzer machine, and then how long it took me to analyze them, including the physical walking time to transition between the two places I’d need to go. I measured how long it took me to manipulate my samples if I did preparatory work during my otherwise unproductive incubation times. I measured how long it took me to add entries to our sample database. I measured how long it took me to jot neatly into my lab notebook, and, for fun, measured how long it took me to merely scribble unintelligibly into my lab notebook. Attention to detail takes time.

I wrote it all down, and had a nice collection of most of the things that I did and about how long they took me, along with a few variations of those common things and the extra time the variants would take. This is a critical step to time management, and I highly encourage you to do the same: think of things you do frequently, time yourself (even if you think you know how long it takes, get an objective measurement!) and write it down. Once you have several pieces of data for each task you commonly do, you are closer to being ready to make a realistic work schedule for a given day.
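If you want to make the measuring itself painless, a small logging helper can do the bookkeeping for you. The sketch below is one way to do it in Python; the task name and the log file path are hypothetical, and the point is simply to capture a date, a task, and a duration each time you do the work.

```python
# Minimal sketch: timing common tasks and logging the measurements for later planning.
# The task name and log file path are hypothetical; adapt them to your own work.
import csv
import time
from contextlib import contextmanager
from datetime import date

LOG_FILE = "task_times.csv"

@contextmanager
def timed(task_name):
    """Time whatever runs inside the 'with' block and append the result to the log."""
    start = time.perf_counter()
    try:
        yield
    finally:
        minutes = (time.perf_counter() - start) / 60
        with open(LOG_FILE, "a", newline="") as f:
            csv.writer(f).writerow([date.today().isoformat(), task_name, f"{minutes:.1f}"])

# Usage: wrap a task you do routinely, then review the CSV after a few weeks
# to see how long each task really takes versus how long you think it takes.
with timed("prepare samples for analyzer"):
    pass  # ...the actual work goes here...
```

After a few weeks the log gives you several measurements per task, which is exactly the raw material you need for building the schedule described next.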

Before we get to actually making the schedule, there’s one other thing that I learned which is critical: breaking down tasks into particles and tracking completion of each particle like a tyrant. It’s an old piece of advice, but it actually works. Don’t write a plan and have an item that says “do the laundry” with an estimate of two hours. This is asking for stress, because in order to do the task “laundry” you have formed the idea in your mind that it will take 120 minutes of continuous work, which is not true. Doing the laundry isn’t all one step, either. It’s a common work flow with a few different steps that fits into your larger plans for the day, and comes with transition times between steps which can’t be neglected.

In order to put the concept of “doing the laundry” into your schedule, it should really look more like:

Laundry (estimated time 2H total):

  1. Gather the dirty clothes (2 minutes)
  2. Separate the white clothes from the colored clothes (3 minutes)
  3. Put the dirty clothes in the hamper (1 minute)
  4. Grab the detergent (30 seconds)
  5. Bring the detergent and the hamper downstairs (1 minute)
  6. Put the detergent into the washer (30 seconds)
  7. Put the clothes into the washer (1 minute)
  8. Start the washer (15 seconds)
  9. Wash cycle (35 minutes, could do something else in the meantime)
  10. Remove the clothes from the washer (2 minutes)
  11. Transfer clothes to dryer (2 minutes)
  12. Start dry cycle (15 seconds)
  13. Dry cycle (50 minutes, could do something else in the meantime)
  14. Remove clothes from dryer (2 minutes)
  15. Fold clothes (10 minutes)
  16. Bring the folded clothes and detergent back upstairs (1 minute)
  17. Put away the detergent and the folded clothes (10 minutes)

None of these steps are intimidating whatsoever, and you can adjust the times to be more accurate as you go. You may also notice that there are a few opportunities here to reduce the amount of “hands on” time. If you were to gather the dirty clothes, separate them, and put them downstairs the day before you had to do the laundry, for instance, that’d reduce the amount of time you’d have to be working on the day of. In this case, the prep work wouldn’t make a huge difference in reducing the total amount of time spent on the task, but it certainly would give you more flexibility to fit doing the laundry into a given slot of time, because it would take less time on that day.
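For what it’s worth, the particle list makes that arithmetic easy to check. The sketch below simply tallies the estimates from the list above, splitting hands-on time from waiting time; the durations are the rough guesses already given, nothing more.

```python
# Minimal sketch: separating hands-on time from waiting time in the laundry breakdown above.
# Each entry is (step, minutes, hands_on); durations are the rough estimates from the list.
steps = [
    ("gather dirty clothes", 2, True), ("separate whites from colors", 3, True),
    ("put clothes in hamper", 1, True), ("grab detergent", 0.5, True),
    ("carry everything downstairs", 1, True), ("add detergent", 0.5, True),
    ("load washer", 1, True), ("start washer", 0.25, True),
    ("wash cycle", 35, False), ("unload washer", 2, True),
    ("transfer to dryer", 2, True), ("start dryer", 0.25, True),
    ("dry cycle", 50, False), ("unload dryer", 2, True),
    ("fold clothes", 10, True), ("carry back upstairs", 1, True),
    ("put everything away", 10, True),
]

total = sum(minutes for _, minutes, _ in steps)
hands_on = sum(minutes for _, minutes, active in steps if active)

print(f"total elapsed time: {total:.0f} minutes (roughly the two-hour estimate)")
print(f"hands-on time: {hands_on:.0f} minutes")
print(f"free during machine cycles: {total - hands_on:.0f} minutes")
```

A bit over half an hour of the two hours is actual work; the rest is the wash and dry cycles, which is exactly the slack the next paragraph is about.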

Doing the prep work before it was actually needed was a lesson which also greatly improved my ability to multitask. Once you have made lists with particles of things to do for a given task, you can very easily fit your overall schedule together such that while you are doing hands on things for one task, a different task is in its incubation time. This also works for situations in which you hand off your work to someone else, who will later return it back to you. It’s nice to rest sometimes, but this is time that can be used to be productive. If you hand off your work, you can usually make headway on something else in the meantime. If you’ve done your prep work or do your prep work during these times, you’ll find that you can knock down a lot of tasks by just fitting task particles into any open space.

Realizing that turning my tasks into particles allowed me to accomplish more by cutting down my dead time was a huge improvement for my work at the lab. Of course, multi-tasking has consequences: a task with maximum perfectionism applied will be performed alone, so as to allow a full devotion of attention.

Doing something while something else is out of your hands is a basic productivity strategy, and it also implies another good trick for scheduling your particle-sized tasks: leave yourself a margin of error. If you can plan your day to the minute accurately and have no spare time whatsoever, you’re overbooked. In the lab, I always left myself extra loose time in my schedules in order to account for things which might pop up: a coworker asking for help, running out of a reagent and needing to borrow it, a fire drill, coffee with a friend passing by, etc. You need this loose time as an insurance policy against fate, and also for your own sanity. Having extra time to play with often means that you have more time to take more care and more perfection in the tasks that you are doing. Not having extra time means that in the event of anything unexpected, you will be behind schedule, and your tasks can’t be attended to as much as they really should be.

The last major consideration is the amount of perfectionism you are going to invest in each task in your day. I suggest an easy rating scale of 1 to 5, with 5 being tasks that require an extreme amount of care and perfectionism and 1 being tasks which can be breezed through without much fear of a mistake causing a major derailing. Each particle in your list of tasks on your schedule can be rated this way. This way, you can allow yourself to relax a little bit in between focus intensive tasks while also understanding when you are going to need to really put in scrutiny.

Judgments of how much perfectionism to apply should be formed based on the ease of the task, the ease of correcting mistakes, and your familiarity with the task. If it’s quick, easy, hard to mess up, and simple to fix, the task is a 1. If it’s extremely involved with irreversible consequences in the event of a mishap, it’s a 5. This system can help to relieve stress, or at least channel stress toward the correct moments. A quick self-reminder that “this task is a 1” or “this task is a 5” helps keep things in context. In lab research, far more things are closer to 5 than to 1.
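If it helps to make the rating mechanical, here is a toy sketch of that judgment; the function and the example judgments are hypothetical, and the point is the scale, not the scoring.

```python
# Toy sketch: a rough 1-to-5 perfectionism rating from the criteria described above.
def perfectionism_rating(quick_and_easy, hard_to_mess_up, easy_to_fix, familiar):
    """Count a task's forgiving properties; the fewer there are, the more care it needs."""
    forgiving = sum([quick_and_easy, hard_to_mess_up, easy_to_fix, familiar])
    return 5 - forgiving  # all four forgiving -> 1, none -> 5

# Hypothetical judgments for two of the lab tasks mentioned earlier:
print(perfectionism_rating(True, True, True, True))      # 1: slapping labels onto vials
print(perfectionism_rating(False, False, False, False))  # 5: calibrating laser voltages
```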

In summary, time management is absolutely critical, and easily separates effective and productive employees from those who are drowned, stressed, and overwhelmed. It is a common story that giving an extra task to the busiest person results in it getting done the fastest. I think that this story is a result of the superior time management and productivity dispositions that the highest-producing people must have.

To a certain extent, a person that is an effective time manager is a lot like a wood furnace for tasks. There is a finite capacity for how much a wood furnace can burn at any given time, but its response to having wood put into the fire is to burn hotter and more efficiently.

Follow me on twitter @cryoshon to get notified when I write!

How To Write Systematically in 11.5 bites

After a few years of working in biomedical research, plus a philosophy degree from college, I know a few things about writing and thinking systematically. Unfortunately, I see a lot of people stumbling in their writing when they try to create complex abstract or technical materials– writing is tough, and accurate, succinct, detailed, and logical writing is even harder.

To me, systematic writing is a method of writing which seeks to transmute the complex relationships between raw or parsed data into a coherent, readable narrative that can be effectively understood and analyzed by someone who is generally knowledgeable on the topic, but who didn’t gather or prepare the data. Systematic writing is part of a greater family of writing that includes scientific writing, technical writing, and financial writing, along with other types I probably haven’t even thought of.

While this definition may seem overly abstract, I’d like to point out that most of our received and sent communications are not systematic; a news anchor is not relaying systematically prepared information to the public, even though the reporters have gone through the trouble of parsing raw data (events that happened) into a narrative (what the anchor says). The quantity of technical detail and data referencing in a news report is slim, as news reports are designed for a very wide audience who have little previous context for the event that happened (the data). An email we send to a colleague referencing data or analysis is not necessarily systematic writing, as it’s entirely possible for a certain context to be inferred between two people; systematic writing provides its own context and content explicitly to the audience.

Systematic writing is typically intended for a small, already-savvy audience, and should only offer the minimum viable context. A reader with general knowledge on the topic of the piece should be able to acquaint himself with a systematically written piece in short order, but a layman should not, because establishing the amount of context required for a layman would involve a lot of background information which falls outside of the scope of a particular instance of systematic writing. We don’t want our systematic writing to sprawl, because systematic writing is intensely purposeful and detail-heavy, and lots of background information and tangents dilute the factual details we’re trying to communicate.

So, the title promises 11.5 bites describing the process of writing systematically, and without further ado here’s a primer on how to write and think systematically:

  1. Define your goal. What kind of narrative do you want to make, and what data are you planning on using? Who is going to read the report, and how much context will be required?
  2. Put on your white thinking hat.  To use the terminology of the fantastic thought guide Six Thinking Hats, the white thinking hat is purely unbiased and factual thinking used for establishing a common ground among readers. If you’re going to be writing a systematic document which refers to data, you need to make sure that you don’t take any liberties with the data without explicitly qualifying them as speculation or partially supported. No spin!
  3. Assemble your data. You can’t write systematically without having data. Ensure that your data is collated/parsed/charted in a non-deceptive and easy to understand way– the only person you’re trying to inform at this step is yourself, so it behooves you to be honest about the quality of your data and what knowledge we can actually extract in analysis. If there are computations or manipulations required of your data, now is the time to do them.
  4. Determine the limits of what your data can tell you. Soon, we’ll analyze our data, but first, we need to vaccinate ourselves against narrative mistakes. Though it seems simple, it’s easy to slip up and attribute facts to your data that aren’t actually there. Explicitly state the variables which your data depicts (sales, months). Remember that going forward, all of your statements should be in terms of the variables which you outline here. If you’re not talking about information within the purview of the data that your variables describe, you’re not being systematic.
  5. Extract verbal information from your data. Write down simple statements to this effect, such as, “the data for November showed 42 sales.” If you computed averages or other values in your data assembly step, now is the time to introduce them as simple phrases. If you expect that handling the data in this way will be confusing, document your process simply and clearly so that your audience will understand. Do not introduce any explanation at this point; merely state what the data say, and, if necessary, state how the data were processed. Remember not to speculate: the point of this step is to establish purely factual statements.
  6. Analyze your data at a basic level. Now that you have a series of simple statements depicting your data in an unbiased way, comparisons between data statements can begin. Are the sales from November higher than the sales from October? Write that comparison down if it’s relevant to your originally stated goal, and make sure to directly reference the values in your new synthesis statements. The point of this step is to explicitly state simple relationships of the data, independent of any narrative.
  7. Analyze your data deeply. Stay focused on your original goal during this step. What questions can your impartial data statements answer explicitly? Implicitly? What trends in your data are noteworthy? What points of data are outliers? Can you explain the outliers? In this step, writing more complex statements is necessary. “The sales data from November (42 sales) are higher than October (30 sales), following the upward trend of the fall season. These data tell us that the fall season is our strongest selling period, despite the high sales in December.” Don’t try to speculate or hypothesize about “why” yet; just tease out the more complex relationships in your data, and write them down in a clear way. As always, reference your data directly in order to build context for your audience and keep them on the same page. Don’t worry about over-analyzing at this point, we’ll prune our findings later. (A short sketch after this list walks through steps 5 through 7 on sample sales figures.)
  8. Ask why. Why did we see the data that we saw in our analysis? What are the general principles governing our data? Address each piece of relevant data with this question, and be sure to answer it briefly. The outliers that were previously identified need special attention at this point. Keep explanations of your data concise and factual, though remember that your explanations are not actually within your data set, so you should draw in outside proof to support your explanations if necessary. It’s okay to hypothesize if you don’t know exactly why certain data turned out the way that they did, but be sure to explicitly label speculation.
  9. Build a narrative using your data, analyses, and explanations. Consider your starting goal, and how to marshal the data, analyses, and explanations in order to accomplish that goal. Your narrative should proceed first with the data, then with the simple factual statements about the data, then with the more complex analysis, and finish off with the explanation (the “why”) if it’s required. The narrative step of systematic writing is where you put all of the pieces together into one attractive package for your audience. Don’t neglect graceful segues between different portions of the data set. The final product of this step can be considered a first draft of your systematic writing effort, and may take the form of a PowerPoint presentation, meeting agenda, technical report, or formal paper.
  10. Anticipate questions and comments from your audience. Look for areas in which your explanation, analysis, or data prompt a response, and plan accordingly. Questions regarding your narrative are typically the easiest to address by clarifying what you’ve already written explaining why your data appears the way it does. Questions regarding your analysis can get a bit technical depending on the audience, and so you should be prepared to refer back to the source data in your responses. Questions regarding the data itself  or the parsing of the data are the most difficult; typically, the outliers will be under the most scrutiny, and their data quality may be called into question. I find that it helps to get out in front of questions regarding outliers, addressing them to your audience before taking questions.
  11. Prune non-critical information. This is the step where most of the data-statements and analysis statements meet their demise. Which analyses, explanations, and narrative elements aren’t strictly serving your original goal? Remove extraneous information to create a hardened product. Ensure that the relevant context and core data analysis remains, and don’t build a misleading narrative by omitting contradictory relevant data.
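As promised in step 7, here is a minimal sketch of steps 5 through 7 in Python. The monthly sales figures are hypothetical, echoing the November (42) and October (30) examples used above; the point is the progression from plain data statements to explicit comparisons, with the “why” deliberately deferred.

```python
# Minimal sketch of steps 5-7: turning raw data into factual statements, then comparisons.
# The monthly sales figures are hypothetical.
monthly_sales = {"September": 25, "October": 30, "November": 42, "December": 38}

# Step 5: extract verbal information -- plain statements of what the data say.
for month, sales in monthly_sales.items():
    print(f"The data for {month} showed {sales} sales.")

# Step 6: basic analysis -- explicit comparisons between data statements.
months = list(monthly_sales)
for prev, curr in zip(months, months[1:]):
    change = monthly_sales[curr] - monthly_sales[prev]
    direction = "higher" if change > 0 else "lower"
    print(f"{curr} ({monthly_sales[curr]} sales) was {direction} than "
          f"{prev} ({monthly_sales[prev]} sales) by {abs(change)}.")

# Step 7: deeper analysis -- trends and outliers, still stated in terms of the variables.
peak = max(monthly_sales, key=monthly_sales.get)
print(f"Sales peaked in {peak} ({monthly_sales[peak]} sales).")
# Any explanation of *why* belongs in step 8, and should be labeled as such.
```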

The final half-step is, of course, crossing the t’s and dotting the i’s for your final draft– and make sure it’s perfect! A missed detail on something not mission-critical will still distract your audience from your data and analysis.

I hope that my readers have a better idea of how to write and perhaps think systematically after reading this piece. I think that many non-technical people struggle with systematic writing because of how data-centric it is; communicating in the style of referencing data and withholding speculation can be quite difficult for people accustomed to relating written concepts intuitively and emotionally.

If you have any questions, leave ’em in the comments and I’ll respond. I know that the 21st century will have the highest demand yet for systematic thinkers and writers, so I’m also considering forming a consultancy to help organizations train their employees and executives to think and communicate in systematic ways. Expect more on topics like this in the future.

As always, follow me on twitter @cryoshon, re-post my articles to social media, and subscribe to the mailing list on the right!

 

Greetings

Hi.

This blog is my newest project, and I intend to be writing here quite a bit. I’ll be writing about a variety of topics, but my focus is to bring an outsider perspective and unique analysis.

I expect to have a few different kinds of posts:

  • Analysis of and response to recent events
  • Reviews of brain-food genre books
  • Depictions and evaluations of cultural systems and phenomena
  • Proposals of new paradigms

These posts will generally fall into the following niches:

  • Biology/biotechnology
  • Propaganda/public relations
  • Critical thinking
  • Geopolitics
  • Social Justice
  • US Politics
  • Information technology

I may also decide to write a bit about engaging video games, but I don’t intend for it to be a focus unless there’s some absolutely compelling aspect. The seeds for most of the posts that I have planned right now originally come from my comments on HackerNews (my HN profile), as it’s generally receptive to intelligent discussion.

As far as monetization of this blog goes, it’ll be a long road. I’d like to eventually write myself into self-sufficiency, but it seems dubious that I’ll be able to do so using the current monetization plan, which is to use affiliate links and begging. The first step, however, is to build a base of readers by being outrageously creative and unique.

Without further ado, I think it’s time to get started on my first article.