[PMC Weekly Consulting Insight] Monte Carlo simulations and specializing

Modeling the entire specialization process end to end.

Over the weekend I was sitting in a chair, thinking about using Monte Carlo simulation to model the success of specializing.

I know some of you think I'm exaggerating for effect, so pics or it didn't happen:

Back in the Spring of this year, Douglas Hubbard did a free webinar for the Military Operations Research Society, I attended, and I got interested in Monte Carlo simulations. This past weekend I was able to read up on the topic a bit more, and it got me wondering how I could use this approach to model the results of a business deciding to specialize.

To be clear, I don't have a Monte Carlo simulation to show you. But I do have the beginnings of a model -- a work in progress -- that you might find interesting.

There's this professor -- who looks more professorial than any other professor I've ever seen -- named Sam Savage. Big bushy white eyebrows, etc.

He says: "Plans based on average assumptions are wrong on average." Wherever a measurement is uncertain, he advocates replacing averages with probability distributions in order to get a more accurate result when modeling a scenario.

To do this -- and to design a Monte Carlo simulation -- you need to identify your uncertain variables.

That got me thinking: what are the uncertain variables when someone specializes? I've spent a ton of time thinking and hundreds of thousands of words writing about how you make the specialization decision (which is the front end of things), and I've written a lot about what it takes to turn that specialization decision into a robust market position, but I've not really brought all of this together into the kind of model that gets shown on slide decks in business meetings.

And furthermore, I wonder if a model like this might be used to simulate the entire specialization process from the decision to the economic results of that decision. I won't get that far with things in this article, but the basics of the simulation model alone are quite interesting, so I'll focus on that.

I might be doing a poor job of explaining all of this verbally, so let's just cut to the diagram. I'll elaborate below:

There are 4 major variables that I see affecting the outcome of a person or firm deciding to specialize, and 3 of those are composed of minor variables. In Monte Carlo simulation terms, these are our uncertain variables.

Quality of the decision

The Quality of the decision variable is the one I'm generally focused on in my work with clients. It's the first needle I can help them move.

As we work towards a high quality decision, we are looking at lots of constituent variables, and the following three are the most critical:

Risk: Exceeding your risk profile when you decide to specialize sets you up to specialize in ways that are more difficult or risky than you can handle. That causes trouble because you'll flinch when you need to lean in, and you'll give up right before a long-fought victory rounds the corner.

Research: You mitigate risk by understanding the market you're going to specialize in. Good research uncovers critical fundamentals of the market, like buying cycle length, how the market prefers to be marketed to, what problems the market understands and prioritizes solving, and so forth.

Clarity (of the message): Once you've decided how to specialize, the way in which you explain that focus to the market starts to matter. It's possible to make a good specialization decision but cock it up with crappy messaging, though generally clarity in the decision making leads to clarity in expressing the decision.

Market conditions

You can't decide how to specialize without understanding the market conditions. I don't mean "the market" broadly in an Adam Smith sense. Rather, I mean the specific niche market you'd be focusing on as a specialist.

In one way, the variable of market conditions is deeply entwined with the quality of your specialization decision. Thus the Research sub-variable described above.

In another way, market conditions are an independent variable that is unrelated to your specialization decision. That's what I'm talking about here.

You can incorporate the best possible information about market conditions into your specialization decision, but after you make the decision and start to take action, market conditions can change, sometimes unexpectedly.

Demand: From 2006 to 2010, demand for construction materials dropped by 25%. About 74 billion fewer dollars were spent on construction materials.

In the 4-year period from 2015 to 2019, legal cannabis sales increased by 283%. About 8.8 billion more dollars are now spent on legal cannabis than in 2015.

These were swings in demand that couldn't have easily been predicted, and the same thing can happen with the demand for expertise, though demand for truly world-class expertise tends to be less elastic because that demand comes from a variety of sources, not all of which are affected equally by a downturn.

Competitive Set: Again, good research should reveal the competitive set that exists in any given market, but the competitive set can change after you commit to a specialization, sometimes in unexpected or disruptive ways. Apple does this when they replicate the functionality of a third-party app inside iOS or macOS. In the services world, this happens too when someone from an open source project's core team goes freelance, or when a platform builds in functionality that used to require services.

Quality of the execution

Specialization is 20% making a good decision and 80% other variables, the biggest of which is your execution. Generally, execution means your post-specialization marketing efforts.

Consistency: After you decide how to specialize, you work to build a reputation, which is the mental residue of many discrete actions over time. Consistency of execution is one way to more effectively build this reputation.

Effort: This variable is less about how hard or intensely you work, and more about how much effort you're able to dedicate to your reputation-building project. I've seen lots of folks start out strong in terms of effort and then -- often because life throws them a curve ball -- quickly reduce effort, which also undermines consistency. They can recover from this, but it still damages or delays their reputation-building efforts.

Focus: Focus is how narrowly and effectively your marketing is targeted.

Focus, consistency, and effort are related. Focus is like the lens in a magnifying glass and consistency/effort are like the sunshine. Without the lens, the sun warms but isn't concentrated enough to ignite tinder. If the sun is obscured by clouds, the lens focuses whatever light is there, but there isn't enough light to ignite the tinder.

Despite the close relationship between these three variables, it makes sense to separate them out in this model because I have seen situations where focus and effort are high but consistency is not.

Chutzpah

If there's an x-factor in how the whole specialization equation resolves, it's what we might call chutzpah. In some cases, this is confidence. Not always though.

It might be dogged persistence coming from a not-very-confident person. Or creative relationship-building. Or a willingness to -- as Gary Vaynerchuk colorfully says -- eat shit for a while. Or some other personality-driven thing.

Chutzpah is the vitality you put into your execution. It's the spark of life -- unique to you -- that lives inside every part of your move from generalist to specialized expert.

Because you're in a relationship business, chutzpah carries more weight than you might at first think. That's why I rank it as a major variable in my model rather than a sub-variable underneath one of the other 3 major variables.


There you have it: a rough cut at a model for specialization.

My theory is that if I could assign a probability distribution to each variable and weight the importance of each variable, then I could run a Monte Carlo simulation that informs how likely a given person is to succeed with a potential specialization.
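
Here's a rough Python sketch of how that simulation might work. To be crystal clear: every distribution, weight, and threshold below is a placeholder I invented to show the shape of the thing, not a calibrated model.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000  # number of simulated "runs" of a specialization attempt

def sample(a, b, size=N):
    """Draw a 0-to-1 score from a beta distribution (shape parameters are guesses)."""
    return rng.beta(a, b, size)

# --- Quality of the decision: risk fit, research, clarity of message ---
risk     = sample(4, 2)
research = sample(3, 3)
clarity  = sample(3, 2)
decision_quality = (risk + research + clarity) / 3

# --- Market conditions: demand, competitive set (independent of the decision) ---
demand          = sample(2, 2)
competitive_set = sample(3, 2)
market_conditions = (demand + competitive_set) / 2

# --- Quality of the execution: consistency, effort, focus ---
consistency = sample(3, 3)
effort      = sample(4, 3)
focus       = sample(3, 3)
execution_quality = (consistency + effort + focus) / 3

# --- Chutzpah: the x-factor, a single variable ---
chutzpah = sample(2, 2)

# Weight the four major variables (placeholder weights, echoing "20% decision, 80% the rest")
score = (0.20 * decision_quality
         + 0.25 * market_conditions
         + 0.40 * execution_quality
         + 0.15 * chutzpah)

# Call a run a "success" if the weighted score clears an arbitrary threshold
threshold = 0.6
print(f"Estimated probability of success: {np.mean(score > threshold):.1%}")
```

The simulation machinery is the easy part. The real work would be replacing those guessed distributions and weights with ones grounded in actual research about a specific person and a specific market.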

This idea is still kind of a crazy flyer for me, and I'm in over my head with the whole Monte Carlo simulation tool as well. If any of you have done this kind of thing before, I'd love to learn from you!

What about you? How would you model some process or state change that you're involved in with your clients? What are the uncertain variables? 1

-P


Here's what's been happening on my paid Daily Consulting Insights email list:

Subscribe: https://pmc.substack.com


Notes

  1. Yes, if you're an implementor I'm totally trying to trick you into thinking like a consultant. :) Guilty as charged.

[PMC Weekly Consulting Insight] "I didn't want to offend you"

Experiential learning works, but it's harder, so we dislike it.

Quick tophat: If you manage an overseas dev team, would you be willing to spare 2 minutes for a survey? In exchange, I will share the results with you. -> https://forms.gle/QaNfP1SbWAQ9FEJb9


I used to be afraid to ask clients about money. This behavior came from social training that it's impolite to talk about money in some situations. I over-applied that social norm.

Naturally, this paper was of interest to me: "I Didn’t Want to Offend You: The Cost of Avoiding Sensitive Questions" by Einav Hart, Eric VanEpps, and Maurice E. Schweitzer.

The abstract is a good summary, with my bolding added to emphasize a few key points:


Within a conversation, individuals balance competing concerns, such as the motive to gather information and the motives to avoid discomfort and to create a favorable impression. Across three pilot studies and four experimental studies, we demonstrate that individuals avoid asking sensitive questions, because they fear making others uncomfortable and because of impression management concerns. We demonstrate that this aversion to asking sensitive questions is both costly and misguided. Even when we incentivized participants to ask sensitive questions, participants were reluctant to do so in both face-to-face and computer-mediated chat conversations. Interestingly, rather than accurately anticipating how sensitive questions will influence impression formation, we find that question askers significantly overestimate the interpersonal costs of asking sensitive questions. Across our studies, individuals formed similarly favorable impressions of partners who asked non-sensitive (e.g., “Are you a morning person?”) and sensitive (e.g., “What are your views on abortion?”) questions, despite askers’ reticence to ask sensitive questions.


As I've matured as a person I've moved from avoiding sensitive questions (about money, for example) to leaning into those same sorts of sensitive questions. I now have a bias towards asking sensitive questions. This study confirms my bias.

Let's not over-apply the conclusions of this study to real life. Exporting results from Mechanical Turk and a university behavioral science lab to the real world is a big leap. The study’s environment was simple. Perhaps the real world calls for more nuanced decision making.

Still, it's valuable to be reminded that our social training -- or how our personality is wired -- might hold us back in some business situations.

What do you do when you identify some behavior that's holding you back?

I like experiential learning, otherwise known as the "Just do it" method, pioneered by Wieden+Kennedy and Nike. I'm kidding about the W+K part, but not kidding at all about the experiential learning part.

This other study explores something I've sensed about experiential learning: "Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom." Here's the key part:


Despite active learning being recognized as a superior method of instruction in the classroom, a major recent survey found that most college STEM instructors still choose traditional teaching methods. This article addresses the long-standing question of why students and faculty remain resistant to active learning. Comparing passive lectures with active learning using a randomized experimental approach and identical course materials, we find that students in the active classroom learn more, but they feel like they learn less. We show that this negative correlation is caused in part by the increased cognitive effort required during active learning.


Here's my short version: experiential learning works, but it's harder, so we dislike it.

This study is consistent with my experience: learning-by-doing has outperformed learning-by-reading-other-people's-advice. I see changes in my Expertise Incubator participants that advice alone could never produce. They earn these new capabilities and expertise assets through experiential learning and flat out hard work. By just doing it.

What might be your entry point to experiential learning? A few simple suggestions:

  • If you underprice your services, ask "how much money will this project make for the company?" at the next opportunity. Remember that you might be trained to avoid asking sensitive questions, so your evaluation of "the right opportunity to ask that kind of question" might be screwed up. If so, compensate by just doing it at the earliest possible time (just rip the bandaid off). And then doing it again with another client or prospect. Then for a third time. Don't quit after the first time, which might be an outlier of some sort.

  • If you think you understand something, challenge yourself to teach it. Try to explain it in 900 words or less of writing, or in 15 minutes or less of speaking, ideally in public. The word or time limits are artificial constraints, meant to reveal how deep your understanding really goes.

  • If you're intimidated by something, build a scale model. This is not exactly the "Just do it" approach, but sometimes we really do need a lower stakes way to begin. A scale model could be a pilot project or proof of concept or merely a scaled-back version of the thing you're intimidated to do or build.

Have a great day,

-P

PS: I'll be participating in a writing retreat in Tennessee next week, so there will be no Weekly Consulting Insight issue the week of October 7 - 11.


Here's what's been happening on my paid Daily Consulting Insights email list:

Try it out for free: https://pmc.substack.com/trial2w
