PDF Summary: HBR Guide to Data Analytics Basics for Managers, by Harvard Business Review
1-Page PDF Summary of HBR Guide to Data Analytics Basics for Managers
In our data-driven age, analysis and interpretation of data are becoming vital skills for managers, not just technical specialists. HBR Guide to Data Analytics Basics for Managers explains how to effectively employ data to enhance decision-making, covering topics like asking focused questions for actionable insights, evaluating data sources for reliability, and utilizing techniques like regression analysis and visualization to uncover meaningful patterns.
This guide from Harvard Business Review also addresses the human side of analytics, exploring mental blind spots that can distort interpretation and sharing strategies to communicate insights clearly. Additionally, it discusses integrating data science teams and hiring analytics professionals with the right blend of technical, business, and storytelling capabilities.
(continued)...
Other Perspectives
- Comparing two versions might not capture the complexity of user preferences, which could be better understood through multivariate testing or other methods.
- A/B testing may not account for time-based variables or trends that could influence the results independently of the variations being tested.
- Resource constraints might necessitate ending an A/B test early, especially for smaller companies or startups with limited budgets and timeframes.
- Some metrics may have interdependencies, and not examining them together could lead to misinterpretation of the results.
- In some cases, retesting may lead to overfitting, where the changes are optimized for the specific sample of data in the tests rather than for general performance.
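The core of an A/B test is checking whether the observed difference between two variants is larger than chance alone would produce. A minimal sketch in Python of a two-proportion z-test, using hypothetical conversion counts:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))          # two-sided p-value
    return z, p_value

# Hypothetical: variant B converts 5.5% vs. A's 5.0% over 10,000 visitors each
z, p = two_proportion_z_test(500, 10_000, 550, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With these (made-up) numbers, the lift is not statistically significant at the conventional 0.05 level, which is exactly the situation where stopping a test early would mislead.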
Evaluate the Trustworthiness of the Dataset
Before conducting a data analysis, it is crucial to address the question: Can this data be trusted?
Assess Data Quality and Completeness Through Independent Checks
Redman emphasizes the importance of independently assessing data reliability. Investigate the data's origin and definitions. Ask about the data source's reputation both within and outside your organization. Conduct a "Friday Afternoon Test" to spot obvious errors by thoroughly examining a small sample of records.
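The "Friday Afternoon Test" boils down to pulling a small sample of records and eyeballing them for obvious problems. A rough Python sketch of how that manual check could be assisted; the column names and validity rules here are hypothetical:

```python
import csv
import random

def friday_afternoon_test(path, sample_size=100):
    """Pull a small random sample of records and flag obvious problems
    (blank fields, implausible values) for manual review."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    sample = random.sample(rows, min(sample_size, len(rows)))
    flagged = []
    for row in sample:
        # Blank fields are the most obvious class of error
        problems = [f"blank {k}" for k, v in row.items() if not (v or "").strip()]
        # Hypothetical domain rule: an age must be a number no greater than 120
        age = row.get("age", "").strip()
        if age and (not age.isdigit() or int(age) > 120):
            problems.append("implausible age")
        if problems:
            flagged.append((row, problems))
    return flagged
```

The point is not the specific rules but the habit: a hundred records read closely will surface error patterns that summary statistics hide.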
Practical Tips
- Start a "Data Reliability Journal" where you document predictions made by different sources and track their outcomes over time. This could be as simple as noting weather forecasts, sports predictions, or stock market tips, and then recording whether they were accurate. Over time, you'll develop a sense of which sources are more reliable, based on empirical evidence.
- Use online tools to track the external reputation of data providers by setting up alerts for mentions of these sources in news articles, forums, and industry publications. Tools like Google Alerts or social media monitoring platforms can help you stay informed about any major issues or accolades related to the sources you rely on.
- Use free online tools to simulate a fresh perspective on your work. For instance, if you're working with text, convert your document into a different format, like a PDF or an e-book, and review it on a different device. Changing the medium can help you spot errors that you've become blind to in your usual working format.
Clean and Prepare Data For Integration and Analysis
Redman describes a data-cleaning process with three levels of rigor: rinse, wash, and scrub. "Scrub," the most meticulous level, involves detailed analysis and manual correction of a small sample, which builds a thorough understanding of the patterns of mistakes and yields a trustworthy subset. The "wash" step then automates what was learned during scrubbing, cleaning the remaining data with algorithms or statistical imputation of missing values. The "rinse" step, the lightest treatment, replaces evident errors with a "missing value" marker or makes simple corrections. Integrating the cleaned data with existing data also requires meticulous attention to detail. Redman highlights three key tasks: identifying matching records across datasets, aligning units and definitions, and removing duplicates.
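The "rinse" step can be as simple as a set of substitution rules applied to each record. A minimal Python sketch, with hypothetical field names and correction rules:

```python
MISSING = None  # marker for values judged unrecoverable

def rinse(record):
    """Lightest cleaning pass: replace evident mistakes with a missing-value
    marker and apply simple, unambiguous corrections. Rules are hypothetical."""
    cleaned = dict(record)
    # A negative or absurd age is an evident mistake -> mark it missing
    age = cleaned.get("age")
    if age is not None and not 0 <= age <= 120:
        cleaned["age"] = MISSING
    # Simple correction: normalize obvious spelling variants of a state code
    state = (cleaned.get("state") or "").strip().upper()
    cleaned["state"] = {"CALIF.": "CA", "CALIFORNIA": "CA"}.get(state, state or MISSING)
    return cleaned

print(rinse({"age": -3, "state": "California"}))  # {'age': None, 'state': 'CA'}
```

Rules like these are only safe when the correction is unambiguous; anything requiring judgment belongs in the scrub step.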
Other Perspectives
- The rinse stage, as described, might risk introducing bias if 'evident mistakes' are substituted with 'missing value' without a clear rationale or understanding of the data's context.
- Manual corrections, even on a small sample, are subject to human error and may introduce new inaccuracies.
- Automation may not always capture the nuances of data errors that a manual "scrub" might identify, potentially leading to less accurate data cleaning.
- In some cases, marking data as "missing" could affect the results of subsequent analysis, especially if the proportion of missing values becomes significant.
- Overemphasis on meticulousness can lead to analysis paralysis, where the fear of making mistakes prevents timely decision-making or integration.
- Removing duplicates is a complex task that can lead to the loss of valuable information if not done carefully, as what appears to be a duplicate might actually represent a valid variation in the data.
Data Analysis
This section dives deeper into the heart of analyzing data. It discusses applying predictive analytics, understanding and using regression techniques, and navigating the common cognitive pitfalls that can distort decision making.
Understand the Basics of Forecasting Analysis
Predictive analysis helps anticipate future results by using past data, but relies heavily on the accuracy of data, statistical models, and model assumptions.
Importance of Accurate Data, Models, and Assumptions in Predictions
Davenport explains that predictive analytics uses historical data to forecast the future. This approach relies on high-quality data along with appropriate statistical models. However, one must be mindful of the assumptions underlying the frameworks. The most crucial presumption is that future patterns will mirror historical ones. Real-world behaviors are dynamic and may evolve over time, decreasing the reliability of older models.
Context
- The accuracy of predictions heavily depends on the quality of the data used. Clean, relevant, and up-to-date data is crucial for building reliable predictive models.
- Various software and tools, such as Python, R, and machine learning platforms, can aid in building and testing predictive models efficiently.
- It is crucial to test and validate assumptions using statistical tests and diagnostic plots to ensure they hold true for the data being analyzed.
- Innovations can create new patterns and behaviors that historical data cannot predict, requiring updated models and assumptions.
- Increasing global interconnectedness can introduce new variables and complexities that older models may not account for.
Use Key Questions to Evaluate Assumptions in Predictions
Davenport recommends asking several key questions to assess the robustness of predictive analytics. These include questions about the data's origins, representativeness of sample data, presence of outliers, and underlying assumptions of the models. Continuously monitoring for changes in key factors affecting model assumptions is essential to ensure the continued reliability of predictions.
Practical Tips
- Experiment with planting a garden using predictive analytics to optimize growth. Research the best planting times and conditions for your area, and keep a record of weather patterns, plant growth, and yield. Use this data to predict the best future planting strategies, potentially increasing your garden's productivity and your understanding of agricultural analytics.
- Initiate a "What-If" game with friends or family during discussions that involve predictions or planning, where each participant proposes an outlier event or unlikely scenario that could impact the outcome. This game can be both entertaining and enlightening, helping you to consider a wider range of possibilities and prepare for unexpected turns in various aspects of life.
Learn the Fundamentals of Regression Analysis
Using regression, you can grasp how variables are related and create forecasts.
Using Regression Analysis to Identify and Predict Relationships
Redman explains that regression analysis is a widely used method in data analytics for determining which factors influence a variable of interest. It involves identifying independent variables that may affect the variable you wish to predict or understand. By plotting the data and fitting a "regression line" that optimally aligns with the data points, you can estimate how the variables relate to each other. Residuals help quantify the uncertainty associated with the line of best fit.
Other Perspectives
- Regression analysis typically requires a large sample size to produce reliable results; with small datasets, the analysis may not have enough power to detect true relationships.
- Causality cannot be inferred from regression analysis alone; it can only indicate associations, which may be due to underlying, unmeasured variables.
- Outliers or influential points can disproportionately affect the regression line, leading to misleading estimates of the relationship between variables.
- Quantifying uncertainty through residuals assumes that the errors are normally distributed, which may not always be the case in real-world data.
Distinguishing Between Causal and Correlated Relationships in Regression
Redman cautions that correlation doesn't indicate causation. Observing that variables correlate during a regression study doesn’t necessarily mean that one variable causes the other. It’s crucial to seek the practical mechanisms driving the relationship through further investigation and observation.
Context
- Confounding variables are external factors that can affect both variables being studied, potentially leading to a false assumption of causation. Identifying and controlling for these is crucial in research.
- Correlation measures the strength and direction of a linear relationship between two variables. It is quantified by the correlation coefficient, which ranges from -1 to 1.
Steer Clear of Frequent Cognitive Biases in Decisions Based on Data
Data-driven decisions aren't foolproof. Cognitive biases might sneak into our reasoning and skew our perspectives, leading to flawed interpretations and ill-advised decisions.
Avoid Confirmation Bias, Overconfidence Bias, and Overfitting; Use Mitigation Techniques
MacGarvie and McElheran identify three main mental pitfalls that affect data interpretation. Confirmation bias makes us selectively focus on evidence that confirms pre-existing beliefs. To mitigate it, consider alternative viewpoints, actively seek data that challenges your perspective, and don't ignore statistically insignificant findings. Overconfidence bias leads us to overestimate the accuracy of our predictions and judgments. MacGarvie and McElheran suggest conducting an anticipatory analysis to imagine potential failures, using the "devil's advocate" method to challenge your assumptions, and comparing your predictions against actual outcomes. Lastly, the overfitting trap arises when a model describes random noise instead of the underlying pattern. To avoid it, use separate data sets for training and validation, favor simple analyses based on logical hypotheses, and consider alternative interpretations of your data.
Practical Tips
- Challenge your own beliefs by engaging in a "Belief Audit" where you list out your key beliefs and actively seek out information that contradicts them. This exercise forces you to confront opposing viewpoints and assess the strength of your own convictions. For example, if you believe that a certain diet is the healthiest, deliberately read studies or articles that critique or present alternatives to that diet.
- Create a "second opinion" habit by asking a friend or colleague to assess your predictions before you act on them. Choose someone who tends to be more conservative or skeptical to provide a counterbalance to your natural overconfidence. If you're planning to invest in stocks, for instance, discuss your predictions about market movements with a financially savvy friend who might see risks you've overlooked.
- Use a "Contrarian Buddy" system when facing important decisions. Pair up with a friend or colleague and agree to challenge each other's ideas by taking an opposing viewpoint. This exercise will force you to defend your assumptions and consider different angles, strengthening your devil's advocate method.
- Use a decision journal to track the accuracy of your predictions and decisions over time. By recording the reasons behind your decisions and comparing them with the outcomes, you can identify patterns in your thinking that may be influenced by noise rather than the core issue. For example, if you notice that your investment choices are often swayed by short-term market fluctuations rather than long-term company performance, you might be overfitting your decision-making process to market noise.
- Start a 'simplicity challenge' with friends or colleagues where you tackle complex problems using only basic tools and methods for a set period. This aligns with the focus on simple analyses. For example, if you're trying to improve a process at work, instead of jumping to sophisticated software solutions, first try using a whiteboard and sticky notes to map out the process and identify improvements.
Statistical vs. Practical Implications in Analytical Results
Redman also clarifies the distinction between statistical and practical significance. Statistical significance assesses whether a result reflects a genuine effect or is merely a coincidence. The p-value indicates the probability of observing a result at least as extreme as the one obtained if the null hypothesis (no effect) were true. In contrast, practical significance concerns the business relevance and importance of the results. Even statistically significant results may lack practical meaning for decision making.
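The gap between statistical and practical significance shows up clearly as sample sizes grow: the same tiny lift becomes "significant" with enough data. A sketch with hypothetical conversion rates, using a normal approximation:

```python
import math

def p_value(lift, base, n):
    """Two-sided p-value for a lift over a base conversion rate with n
    visitors per arm (normal approximation to a two-proportion test)."""
    pool = base + lift / 2
    se = math.sqrt(2 * pool * (1 - pool) / n)
    return math.erfc(lift / se / math.sqrt(2))

# The same 0.1-point lift (5.0% -> 5.1%), at three different scales
for n in (10_000, 100_000, 2_000_000):
    print(f"n = {n:>9,}: p = {p_value(0.001, 0.050, n):.4f}")
```

At two million visitors per arm the p-value is vanishingly small, yet the practical question is unchanged: is a 0.1-point lift worth acting on?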
Context
- When presenting analytical findings, it’s crucial to convey both statistical and practical significance to stakeholders to ensure informed decision-making.
- The null hypothesis is a default assumption that there is no effect or no difference. Statistical tests aim to challenge this assumption by providing evidence that supports an alternative hypothesis.
- These provide a range of values within which the true effect size is expected to lie, offering more information than a p-value alone.
- It assesses whether the results have sustainable and long-term benefits for the organization, rather than just short-term gains.
- The business environment or specific industry context may render a statistically significant result irrelevant if it does not align with strategic goals or market conditions.
Communicating Data Findings
Once you've evaluated the information, effectively conveying what you've discovered to key players is essential for driving change and securing buy-in for your decisions.
Make Data Meaningful With Successful Storytelling
Data by itself rarely convinces. To persuade people, weave what you discover into compelling narratives that resonate with them.
Numbers Lack Voice—Craft a Compelling Story
Morgan argues that data alone cannot affect people's choices. To be persuasive, you must reach the subconscious, which is significantly influenced by emotions. While data provides valuable supporting evidence, it’s essential to incorporate it into emotionally powerful narratives to engage people on a subconscious level. People tend to be swayed more by engaging narratives than by dry, data-filled presentations.
Context
- Human decision-making is often influenced by cognitive biases, which are systematic patterns of deviation from norm or rationality in judgment. These biases can cause people to rely more on stories and emotions than on objective data.
- Neuroscience research shows that emotional responses can activate the brain's reward system, reinforcing certain behaviors and choices.
- Studies show that stories can trigger the release of oxytocin, a hormone associated with empathy and trust, which can enhance the persuasiveness of the message.
- Narratives often incorporate cultural references or familiar scenarios, making them relatable and easier for diverse audiences to grasp and remember.
Structure Presentations to Educate and Persuade, Not Just Impress
Morgan recommends prioritizing emotional impact over simply delivering data. To convince others, tell stories that highlight the potential for development connected to your proposal, making your data an integral part of the narrative, rather than the central focus. This approach will help you connect better with your audience and lead to more favorable outcomes.
Practical Tips
- Practice empathetic listening during Q&A sessions by responding to questions with feelings and experiences rather than just facts. When someone asks a question, first acknowledge the emotion behind it before diving into the data. If someone expresses concern about the figures you presented, you might say, "I understand this can be quite alarming; I felt the same when I first looked at these numbers. Here's what they mean for us..." This approach shows you value the emotional context of the conversation as much as the information being discussed.
- When pitching an idea to colleagues or superiors, frame it within a story that demonstrates its future growth potential. For instance, if you're suggesting a new software tool for your team, narrate a scenario where this tool not only streamlines current operations but also opens up opportunities for innovative projects that could be pursued, painting a picture of long-term benefits and advancement.
- Practice storytelling in everyday conversations to refine your skills. Start by incorporating small stories into your discussions with friends, family, or colleagues. For instance, instead of simply stating a fact, relate it to a short personal experience. If you're talking about the benefits of exercise, you might share a quick tale about how a morning run led to an unexpected friendship.
- Record and analyze your presentations to identify areas for improvement. Use your smartphone or a simple camera setup to record your practice sessions. Watch the playback to observe your body language, use of language, and engagement techniques. Take notes on what works well and what doesn't, and adjust your approach accordingly for future presentations.
Use Visualizations Strategically to Enhance Communication
Visualizing data translates abstract numbers into concrete images, making it easier for the audience to process information and derive insights.
Align Visualizations With Purpose: Confirmation, Education, or Exploration
According to Stikeleather, visual representations of data should serve a clearly defined purpose. Visualizations may serve to confirm existing assumptions, educate the audience about data patterns, or explore relationships within a given data set. It’s crucial to tailor these graphics to their respective purposes and understand the potential hazards of misinterpreting data.
Practical Tips
- Create a visual comparison of your daily activities to optimize time management. Track your activities for a week using a simple spreadsheet, then use a color-coded calendar or heat map to visualize how you spend your time. This can highlight areas where you might be able to combine tasks or find free time for relaxation or hobbies.
- Partner with a peer for a graphic swap critique. Find someone who also creates graphics for various purposes and agree to exchange your work for feedback. The fresh perspective can help you see how well your graphics align with their intended purpose and where there might be room for improvement. This collaborative approach can lead to new insights and ideas for tailoring your visuals more effectively.
- Start a "Two-Sided Debate" with friends or family where you discuss a data-driven topic, with each person presenting opposing interpretations of the same data set. This exercise will force you to consider multiple perspectives and question your assumptions. For instance, if discussing the impact of a new policy, one person could argue the positive effects based on certain data, while another could counter with negative impacts using the same data, highlighting how the same numbers can tell different stories.
Ensure Visualizations Accurately Represent Information and Are Memorable
Duarte emphasizes clarity and accuracy in visualizing data. Avoid cluttered visuals, choose chart types that suit your message, and emphasize key information. To make your data memorable, use visual metaphors that resonate with your audience and bring the scale of the figures to life.
Practical Tips
- You can enhance your presentations by using metaphorical imagery to represent complex data. For instance, if you're trying to show growth, consider using a tree with branches of varying lengths to represent different growth stages or metrics. This visual metaphor can make the concept more relatable and easier to understand for your audience.
- Create a minimalist desktop on your computer or phone. Limit your home screen to essential apps and use folders to group less frequently used ones. This reduces visual clutter every time you unlock your device, making it easier to find what you need and reducing cognitive load.
- You can practice matching data to chart types by creating a visual diary. Start by collecting daily information like your expenses, steps walked, or hours slept. Then, choose a chart type that best represents the trend or message you want to convey for each data set. For example, use a line chart to show your daily step count trend over a month or a pie chart to display the proportion of your expenses.
- Practice the 'one-sentence summary' technique after meetings or conversations by distilling the main idea into a single, clear sentence. After any significant interaction, take a moment to write down the core message or takeaway in one sentence. This exercise will train you to identify and focus on the most relevant information, which can improve your communication and retention skills.
- Practice metaphorical thinking by reinterpreting a mundane object or situation in your daily life as something with deeper meaning. For instance, view your morning coffee routine as a "warm-up session" for your brain, similar to an athlete preparing for a game. This exercise can help you develop the skill to find and use metaphors that connect with others in various scenarios.
- Create a personal blog or social media content series that breaks down large-scale issues into relatable, human-sized stories. If you're passionate about environmental conservation, share stories of individual animals affected by habitat loss or climate change, which can make the vast issue feel more immediate and personal to your audience.
Handle Challenges When Communicating Uncertainty
Forecasts and predictions inevitably involve uncertainty. Accurately conveying unpredictability is crucial for setting realistic expectations and building trust with stakeholders.
Conveying the Ambiguity of Statistics and Likelihood to Non-Technical Audiences
Berinato explains that expressing statistical unpredictability is challenging: visualizations are meant to make data concrete, yet statistical uncertainty is abstract and resists visual representation. Probability is also hard for people to grasp, because unlikely events, even with low probabilities, can still occur. The difficulty becomes apparent when conveying the unpredictability of events like elections or rare occurrences in business, where hindsight bias makes it easy to blame supposedly inaccurate models for unexpected outcomes, even when those models accurately conveyed low probabilities.
Other Perspectives
- Visualizations can be accompanied by narrative explanations or annotations that help to clarify the nature of the uncertainty being represented, thus enhancing the audience's understanding.
- People may not inherently find it hard to understand probability; rather, the way probability is taught or communicated might be ineffective or too abstract for non-technical audiences.
- The use of analogies and relatable examples can make the concept of unpredictability more accessible, indicating that the challenge is surmountable with creative educational approaches.
- The issue may not always be hindsight bias; it could be that the models did not consider a wide enough range of scenarios or lacked robustness.
Communicating Data-Driven Insights: Techniques For Transparency on Limitations and Risks
Berinato emphasizes the importance of depicting uncertainty visually. Some approaches exist, such as representing ranges with bars or using saturation to indicate likelihood, but the challenge lies in finding methods that are effective and universally understood. It is also essential to use clear language when discussing probability, acknowledging the possibility of less likely outcomes even while highlighting more likely ones.
Practical Tips
- Create a "confidence meter" for your next group project to represent the uncertainty of each task's completion. On a scale from low to high confidence, mark each task with a corresponding indicator, such as a thermometer or slider graphic. This tool can facilitate discussions about where more resources or attention is needed, making the project management process more dynamic and responsive to change.
- Develop a simple coding system using colors, symbols, or numbers to indicate uncertainty levels in your personal budget or schedule. For instance, use a color gradient in your calendar from green (certain events) to red (tentative plans), or label expenses in your budget with symbols like a question mark for estimated costs and a checkmark for fixed costs.
Hiring and Working With Data Scientists
This section focuses on understanding the unique skills data scientists have and how to effectively harness their capabilities within organizations.
Unique Skills and Mindset of Analytics Professionals
Data professionals possess a unique blend of analytical, technical, and communication skills that make them valuable assets for navigating the complexities of massive datasets.
Data Professionals Combine Coding, Analytics, Communication, and Savvy Business Skills
Davenport and Patil say data scientists are a blend of a data hacker, an analyst, a communicator, and a trusted adviser. Their core skills involve data collection, analysis, visualization, and communication. They excel at uncovering and interpreting patterns in large and often unstructured data sets. Beyond technical skills, they possess a strong sense of curiosity, the ability to think associatively, and a knack for using data to tell stories.
Other Perspectives
- The emphasis on combining these skills might overlook the importance of teamwork and collaboration, where different team members contribute different expertise.
- Being a "trusted adviser" suggests a level of seniority and experience that not all data scientists may have, especially those who are early in their careers.
- The core skills mentioned do not address the need for continuous learning and adaptability, which are essential given the rapidly evolving nature of data science tools and methodologies.
- The term "excel" is subjective and can lead to unrealistic expectations; some data professionals may be competent but not necessarily outstanding in their ability to uncover and interpret data patterns.
- Curiosity and associative thinking, although important, must be balanced with critical thinking and skepticism to avoid jumping to conclusions or seeing patterns where none exist.
- Not all data professionals are equally skilled in storytelling; some may excel in technical aspects but struggle with narrative construction.
Recruit Data Scientists Beyond Traditional Backgrounds
Davenport and Patil advise managers to look for data scientists beyond traditional backgrounds in business analytics or data management. Often, individuals with backgrounds in fields like physics, ecology, or computer science possess the necessary skills and intellectual curiosity that make them well-suited for this role.
Practical Tips
- Consider enrolling in an online course that teaches the basics of physics, ecology, or computer science to gain a foundational understanding of these disciplines. This knowledge can help you appreciate the skills and perspectives candidates from these fields bring to data science roles. Platforms like Coursera or edX offer courses designed for beginners without a background in these subjects.
Integrate Data Scientists Into the Company
Effectively bringing data scientists into the organization requires a thoughtful approach that fosters innovation and collaboration.
Empower Data Scientists to Innovate With Customer Products and Processes
Davenport and Patil highlight the importance of empowering those working in data science to solve real-world problems and contribute to product development. They suggest encouraging a collaborative approach where product and service teams work closely alongside data scientists, rather than merely generating reports for upper management. Examples of success stories include Jonathan Goldman’s contributions to LinkedIn and George Roumeliotis’s efforts at Intuit.
Practical Tips
- Encourage curiosity about data in your community by organizing a 'data scavenger hunt'. Create a list of questions that can be answered by publicly available data, such as "How many parks are in our city?" or "What was the average home price in our neighborhood last year?" Participants can use their phones or computers to research the answers. This activity promotes the practical use of data to uncover interesting facts and encourages participants to think about how data shapes their understanding of the world around them.
- You can start a blog or social media page dedicated to exploring and discussing the intersection of data science and product development. Share insights, hypothetical product improvements, and case studies that speculate on how data science could enhance various products. This activity will help you understand the practical applications of data science in product development and encourage others to think along the same lines.
- Create a shared digital workspace using tools like Trello or Asana where both data scientists and product teams can post questions, updates, and ideas for feedback. This transparent approach ensures ongoing communication and allows for real-time collaboration. For instance, a data scientist could post preliminary data analysis for the product team to review and provide context, leading to more informed decisions.
- You can enhance your understanding of data science by creating visual summaries of complex reports for personal use. Start by taking a recent report or dataset and use free online tools like Google Charts or Tableau Public to visualize the data. This will help you grasp the importance of presenting data in an accessible way, which is crucial for communicating with non-experts.
- Create a 'Micro-Mentorship' initiative to gain insights from various professionals. Instead of seeking a long-term mentorship, focus on short, focused interactions with multiple mentors. Use social media or professional networking sites to connect with professionals you admire, asking for a single, one-time mentorship session to discuss a specific topic or challenge you're facing. This could be as simple as a 15-minute virtual coffee meeting where you ask targeted questions about their experiences and strategies for success.
- Develop a habit of conducting mini 'data audits' on your personal projects or tasks. For each project, take a moment to list the data you have, what data might be missing, and how you can obtain it. This practice will make you more data-conscious and likely to seek out and utilize relevant data in your personal and professional life.
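The visualization tip above suggests tools like Google Charts or Tableau Public; as a minimal, dependency-free sketch of the same idea, the snippet below turns a few numbers into a rough text-based bar chart. The dataset is invented purely for illustration:

```python
# Hypothetical monthly figures pulled from a report (invented data).
data = {"Jan": 120, "Feb": 135, "Mar": 128, "Apr": 150}

# Scale each value against the largest one so the longest bar is 20 marks.
peak = max(data.values())
for month, value in data.items():
    bar = "#" * round(value / peak * 20)
    print(f"{month} {bar} {value}")
```

Even a crude chart like this makes the shape of the data easier to take in than a column of raw numbers, which is the point of the tip; dedicated tools simply do the same job with more polish.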
Foster Collaboration Between Data Scientists and Business Teams
The authors advocate for fostering an environment where data science professionals can connect with others both internally and externally, emphasizing the importance of continuous learning and knowledge sharing. Organizations should promote participation in conferences and workshops and support internal communities of practice. While managing data scientists means granting them some freedom to explore and experiment, leaders should also set clear goals and high expectations that push them toward advanced analytics.
Practical Tips
- Start a virtual book club focused on data science and business integration, inviting members from diverse backgrounds. The goal is to share insights and learn from each other's perspectives. You might read a book on basic data analysis one month and a business strategy book the next, then discuss how the two fields can work together.
- Use a habit-tracking app to set and monitor personal learning goals, ensuring that you allocate time each week to acquire new knowledge. To share what you've learned, you could then volunteer to present at local meetups or write a blog post. This not only holds you accountable for your learning objectives but also provides a platform to disseminate knowledge to others who might benefit from it.
- You can create a social media challenge that encourages sharing conference takeaways. Post your key learnings or an interesting fact from a conference with a specific hashtag and challenge friends or colleagues to do the same. This not only promotes the event but also spreads valuable insights to a wider audience.
- You can foster a culture of shared learning by starting a peer-mentoring program at work. Pair up individuals from different departments to exchange skills and knowledge on a bi-weekly basis. This not only encourages cross-departmental communication but also allows for the organic growth of communities of practice. For example, a marketing professional might share insights on brand building with a product developer, who in turn could provide a fresh perspective on user experience.
- Encourage your team to dedicate a portion of their workweek to personal projects that align with company goals. By allocating a set amount of time, such as 10% of their work hours, team members can explore new data sets, test different algorithms, or investigate emerging trends in data science without the pressure of immediate results. This mirrors the concept of Google's "20% time," which has led to the creation of successful products like Gmail.
- Create a vision board that visually represents your goals for using analytics in a personal project. For instance, if you're interested in nutrition, collect images and data points that reflect your ideal dietary patterns and use them to inspire a structured plan to achieve your nutritional goals, tracking progress with an app or journal.