Nathan Brown, Ph.D.


About Nathan Brown, Ph.D.

Nathan Brown, TrueBearing co-founder, works primarily with organizations navigating the liabilities and opportunities inherent in times of significant change. As a psychologist, Dr. Brown brings to his work a deep understanding of the personal challenges facing today's leaders, as well as experience with the organizational strategies that nurture professional success. Dr. Brown is passionate about unleashing the power of practical, evidence-based methods in support of personal and organizational missions.

Has the time come for Moneyball for government?

January 2nd, 2015 | Evaluation, Data and Statistics, Evidence-Based Decision Making, Best Practices

Tying funding for social programs to their effectiveness seems like a no-brainer. Sadly, however, genuine evidence-based decision making in policy and budget priority-setting in federal social spending is all too rare. Evaluations are often required in federally funded social programs, but the standards of evidence have often been unclear or lacking altogether (sorry, but client satisfaction scales as your sole metric are a poor measure of effectiveness!). On top of that, since performance is rarely considered in funding decisions, programs have little incentive to change and improve in response to evaluative feedback. An effort to rectify this situation began in the Bush II years, but even then less than 0.2 percent of all nonmilitary discretionary programs were held to rigorous evaluation standards. Kind of takes your breath away, doesn’t it? That’s a particularly disturbing figure when you consider that, according to Ron Haskins in the NYT, “75 percent of programs or practices that are intended to help people do better at school or at work have little or no effect.” One way of interpreting this shocking figure: in the absence of evidence-based decision making, massive sums that could be invested in promising alternatives are instead tied up in ineffective programs. […]
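
To put rough numbers on that interpretation, here is a back-of-envelope sketch in Python. The portfolio size is a hypothetical figure of our own; only the 75 percent and 0.2 percent shares come from the sources cited above.

```python
# Back-of-envelope sketch. The portfolio size is hypothetical; the 75%
# and 0.2% shares are the figures cited in the post above.

SHARE_INEFFECTIVE = 0.75  # programs with little or no effect (Haskins)
SHARE_EVALUATED = 0.002   # programs held to rigorous evaluation standards

portfolio = 100e9  # hypothetical: $100B of social-program spending

tied_up = portfolio * SHARE_INEFFECTIVE
unexamined = portfolio * (1 - SHARE_EVALUATED)

print(f"Potentially reallocatable: ${tied_up / 1e9:.0f}B of ${portfolio / 1e9:.0f}B")
print(f"Never rigorously evaluated: ${unexamined / 1e9:.1f}B")
```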

EBDM On the Fly 2: The secret sauce for successful leaders

January 20th, 2015 | EBDM On The Fly

Let’s get to the heart of the matter: What are the core things you must do as a leader? Is there a secret sauce that can add flavor to your many roles? Millions of words have been published on this topic. At one time or another, you’ve probably heard that the essential function of leadership is to: […]

EBDM On the Fly 3: The 30,000-foot view

February 3rd, 2015 | EBDM On The Fly

Evidence-based decision making (EBDM) is rooted in the health sciences, where it goes by the moniker “evidence-based practice.” In recent years it has evolved rapidly as a means of improving the quality of decisions, adapting to diverse applications. As an organizational leader on the lookout for ways to improve your skills and effectiveness, you may have heard of EBDM. But others may be asking, “What exactly is EBDM, anyway?” As with any rapidly developing practice, definitions differ. A lot, in fact. But for our purposes, here’s an “on the fly” definition that forms the backbone of this series: Evidence-based decision making (EBDM) is a process for interpreting and organizing information gathered systematically from diverse sources to create a testable model of the relative effectiveness of alternative programs, policies, or practices. Whew. There are a lot of moving parts in there. Having a firm grasp of each element is crucial to the practice of EBDM, and this series will unpack them one by one. […]
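
As a preview of those moving parts, here is a minimal sketch of the definition’s structure in Python. Every class name and the scoring rule are illustrative assumptions of ours, not an established EBDM schema.

```python
# Illustrative sketch of the "on the fly" definition: evidence from
# diverse sources, organized into a testable, comparative model of
# alternatives. Names and scoring are our own inventions.
from dataclasses import dataclass, field

@dataclass
class Evidence:
    source: str    # where it came from ("diverse sources")
    finding: str   # what it says
    rigor: float   # 0-1 judgment of methodological quality

@dataclass
class Alternative:
    name: str  # a candidate program, policy, or practice
    evidence: list[Evidence] = field(default_factory=list)

    def effectiveness_score(self) -> float:
        """A testable, revisable estimate of relative effectiveness;
        here, simply a rigor-weighted count of supporting evidence."""
        return sum(e.rigor for e in self.evidence)

def rank_alternatives(alternatives: list[Alternative]) -> list[Alternative]:
    """Organize the gathered information into a comparative model."""
    return sorted(alternatives, key=lambda a: a.effectiveness_score(), reverse=True)
```

The point isn’t the scoring rule; it’s that the definition implies comparing alternatives against a shared, testable model rather than judging each program in isolation.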

EBDM On the Fly 4: Anatomy of a decision

March 19th, 2015 | EBDM On The Fly

So far we’ve identified two basic skills for leadership success: harness information to make consistently good decisions, and learn systematically from the less-than-good ones. Chances are, at this point some readers are muttering under their breath: “Well, all this talk about decision making being the heart of leadership and the importance of evidence is fine, but it all depends on how you define ‘decision’ and ‘evidence,’ right?” To which I reply: “Exactly. Let’s do that.” What is a decision, anyway? There’s a surprising lack of consensus. Many definitions imply that decision making is a purely cognitive exercise; in this view, a decision is the result of a rigorous mental process carried out by an individual in isolation: “a conclusion or resolution reached after consideration.” Other sources assert that decisions are fundamentally an expression of individual moral character (“firmness of purpose”); in this view, an authentic decision is more an unbending personal statement than a nuanced, goal-directed response to circumstances. Do those perspectives ring true in your experience? Frankly, we think they miss the mark, because most real-world decisions don’t look like that. Most leaders in the trenches simply don’t identify with Rodin’s Thinker, bronze brow furrowed in noble concentration as he wrestles in solitude with deep and thorny principles. Does that sound like your workday? […]

Strategic Planning is Dead

January 13th, 2014 | Evidence-Based Decision Making, Strategic Planning

A recent article in the Stanford Social Innovation Review succinctly makes the case for adaptive strategic planning: a planning process that doesn’t attempt to predict the future but instead encourages a culture of experimentation, learning, and adaptation. Since at least World War II, the prevailing approach to strategic planning in the business, government, and nonprofit worlds has been rooted in traditional military thinking and culture. Based on centuries of hard-won experience, this approach assumes: The past is always the best predictor of the future. Good data is hard to come by, so new information should be greeted with skepticism. Lines of communication are generally unreliable. Therefore a small number of clear directives, not often changed, are essential to coordinate the far-flung elements of our operation. […]
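
For contrast, here is an illustrative sketch (ours, not the SSIR article’s) of planning as an experiment-and-learn loop rather than a fixed forecast. The epsilon-greedy selection rule and all names are assumptions chosen for brevity.

```python
# Adaptive planning as a loop: experiment, observe outcomes, shift effort
# toward whatever the evidence currently favors. Purely illustrative.
import random

def adaptive_plan(strategies, run_experiment, rounds=20, explore=0.2):
    """Run small experiments, learn from the results, and adapt."""
    outcomes = {s: [] for s in strategies}

    def average(s):
        return sum(outcomes[s]) / len(outcomes[s]) if outcomes[s] else 0.0

    for _ in range(rounds):
        if random.random() < explore:
            choice = random.choice(strategies)     # experiment: try something new
        else:
            choice = max(strategies, key=average)  # adapt: back the current leader
        outcomes[choice].append(run_experiment(choice))  # learn from the result
    return outcomes

# Hypothetical usage: each strategy's payoff distribution is unknown to the planner.
results = adaptive_plan(
    ["A", "B", "C"],
    lambda s: random.gauss({"A": 1.0, "B": 2.0, "C": 0.5}[s], 1.0),
)
```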

Best Data Visualizations of 2013

December 18th, 2013 | Data Visualization

It’s that time of year, when the “Best of” lists are published… so why shouldn’t data visualization get some love, too? We love discovering data visualizations that pack a punch, bringing obscure or complex topics into vivid clarity, and we’ll bring them to you whenever we see examples that show the power of a good viz. You’ll find several in this list by Gizmodo (not your usual academic journal, we admit!). […]

Does Pretty = True?

December 1st, 2013 | Data Visualization

Infographics (and their high-octane brethren, data visualizations) can seem pretty darned authoritative. Sometimes they look so elegant and compelling that they short-circuit your logical faculties and trigger your gullibility. But just because they're pretty doesn't make them true, as this humorous infographic from SystemComic demonstrates in style.

Digesting your data

November 25th, 2013 | Evidence-Based Decision Making, Data Visualization

Before the tryptophan sets in, you’ll want to take a look at this fun data visualization from Powerhouse. Want to eat that extra helping of sweet potatoes? Love that pecan pie? […]

Surveying the future of survey techniques

February 7th, 2013 | Evaluation, Data and Statistics, Data Visualization

I’m still sifting through the aftermath of the presidential election and the controversial accusations of bias leveled against pollsters across the political spectrum. For data geeks, the question of bias in polling is a source of endless fascination. As someone whose profession involves mostly non-political surveys, however, I zeroed in on a basic methodological question: regardless of polling firm, which polling method showed the least bias in predicting the election? According to a detailed post by Nate Silver, the answer is clear, and it should make anyone who relies on survey or polling data sit up and take notice: all things being equal, online surveys showed 40 percent less bias than live telephone interviews, and an astonishing 72 percent less than automated telephone “robopolls.” […]
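
For anyone who wants to replicate that kind of comparison, the core computation is straightforward: average the signed error (predicted margin minus actual margin) within each polling method. The polls below are made-up placeholders, not Silver’s data.

```python
# Mean signed bias per polling method. All rows are hypothetical;
# see Silver's post for the real election data.
from collections import defaultdict
from statistics import mean

# (method, predicted Dem-Rep margin, actual Dem-Rep margin)
polls = [
    ("online", 3.1, 3.9),
    ("online", 4.2, 3.9),
    ("live phone", 1.5, 3.9),
    ("robopoll", 0.8, 3.9),
    ("robopoll", 1.2, 3.9),
]

bias_by_method = defaultdict(list)
for method, predicted, actual in polls:
    bias_by_method[method].append(predicted - actual)  # signed bias per poll

for method, errors in sorted(bias_by_method.items()):
    print(f"{method:>10}: mean bias {mean(errors):+.1f} points")
```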

Data Scientist: The Sexiest Job of the 21st Century

October 5th, 2012 | Evaluation, Data and Statistics

An article from the Harvard Business Review we can get behind: Data Scientist: The Sexiest Job of the 21st Century