Transformation to a Data-Driven Organization: Think 2019 Session #2397

[ first published on the IBM Cloud Blog ]

The most successful businesses will equip every layer of the organization with the data needed to identify and drive growth. The search for this self-service data platform and its underlying culture has fueled IBM’s own internal business transformation within the Watson Cloud Platform. Unsurprisingly, the path involves both technical and cultural challenges. Technically, you need to ensure that your data shows the full picture in a timely fashion, with quality information in the tool of choice. Culturally, you need to create an accessible platform that allows subject matter experts to perform deep analysis without becoming data scientists.

Learn more at Think 2019

Join us at Think 2019, Session #2397 – “Transformation to a Data-Driven Organization,” to learn how IBM Cloud built a data platform enabling all team members to drive quality and growth. We will discuss technical and cultural challenges that we have faced and the strategies used to tackle them head on.

Register for Think 2019 and sign up to attend our session!

Nic Sauriol
IBM Cloud Platform Development | Analytics

Arn Hyndman
Software Design, User Experience, & Architecture | IBM Hybrid Cloud

Client Insight | IBM Hybrid Cloud

Growth Fabric — Learnings for Repeatable Growth

Ensuring that your growth efforts are repeatable

[ first published on the IBM Cloud Blog ]

If mastering growth isn’t already challenging enough, how do we ensure it’s repeatable? I recently participated in a panel at Amplify 2018 on this topic, so here is a list of five steps (because we’re supposed to do lists, right?) to consider. The goal is to ensure that your hard-won efforts to create or transform yourself and your organization into a growth juggernaut are not lost the moment something changes. What is the process or set of actions that can be repeated for the next feature, product, market, or business? How do you ensure that the SaaS metrics you have worked so hard to track and improve continue to improve when you stop looking? My perspective is biased toward Cloud Solutions, but I doubt these notions are limited to that space.

1. Values: Realistic targets

For the past 20 years, I have worked for companies generating massive revenue streams. Christensen defines “values” in his RPV (resources, processes, and values) framework (discussed in many of his articles and books, including this one) in the sense of what a business or organization considers good or bad. Consider an experiment that demonstrates, through a robust A/B test, a 50% increase in the number of visitors who successfully upgrade their account on a product generating $10M in annual revenue. That might translate to an improvement of $3M–$5M over the next 12 months: an awesome achievement. Yet in companies I have worked for, such numbers might be considered noise, not even worth discussing. The challenge is misaligned values. This is further compounded by the fact that many meaningful big wins only become visible as a collection of small wins. To fix this, ensure that your leadership team understands the roadmap. Have a 10-year plan (a spreadsheet that shows the target goal of your business in practical terms, read: revenue) and work backward to a plan that translates that goal into month-to-month growth targets, factoring in realistic network growth (growth loop), realistic retention rates, and so on. Realistic targets allow strong performance to surpass them, ensure that teams don’t walk around deflated (you were never going to make a billion dollars in your first year), and ensure that the organization is working to iteratively improve every detail and experience that your clients will touch. Set realistic and relative targets for improvement, such as “improve the conversion rate on this page by 30% by the end of the year.”
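The spreadsheet exercise above, working a long-range goal back into month-to-month targets, amounts to solving a compound-growth equation. Here is a minimal sketch; the starting revenue, goal, and horizon below are illustrative assumptions, not figures from any real plan:

```python
# Sketch: translate a long-range revenue goal into a month-over-month
# growth target, as the "10-year plan" spreadsheet described above would.
# All inputs are made-up examples for illustration.

def required_monthly_growth(start: float, goal: float, months: int) -> float:
    """Constant month-over-month growth rate that compounds `start`
    into `goal` over `months` months: (goal/start)^(1/months) - 1."""
    return (goal / start) ** (1 / months) - 1

# e.g., growing $100k of monthly revenue to $1M over 5 years (60 months)
rate = required_monthly_growth(100_000, 1_000_000, 60)
print(f"required month-over-month growth: {rate:.2%}")
# → required month-over-month growth: 3.91%
```

A number like 3.91% per month is the kind of realistic, relative target teams can actually surpass, as opposed to an unanchored “make a billion dollars” goal.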

2. Teach: Understand growth

If you have set a realistic goal for the team (see Step 1), all they will likely need to deliver the results you hope for is an understanding of how that data is captured and measured. They may need help learning your testing tools (e.g., figuring out how to set up a solid A/B test). And if you have had some success in growing to date, chances are that your team has grown as well. Work to ensure that your team is versed in the measures that your organization is using to drive growth. If you’re a SaaS Solutions business, have all new hires learn the SaaS lexicon and core concepts. Teaching your core values is also critical from the start; celebrating failure is not something that will likely come naturally. In my opinion, having a common language, tool stack, and understanding of key growth measures is probably the most important factor. Once your team is enabled with this knowledge and power, you may find experiments and improvements happening with almost no other involvement. When your new hires in development are having a healthy debate with marketing about LTV/CAC and whether improvements to the docs pages should be their next focus for experimentation versus a change to the product ordering page, chances are you are doing a great job teaching.
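To make the lexicon concrete, here is one common simplification of the LTV/CAC ratio the paragraph above mentions (lifetime value under constant churn, divided by customer acquisition cost). The specific formula and all input figures are illustrative assumptions, not metrics from the article:

```python
# Sketch of the core SaaS unit-economics vocabulary referenced above.
# One common simplification: LTV = monthly revenue per account
# * gross margin / monthly churn rate. Inputs are made-up examples.

def ltv(arpa_monthly: float, gross_margin: float, monthly_churn: float) -> float:
    """Customer lifetime value, assuming a constant monthly churn rate."""
    return arpa_monthly * gross_margin / monthly_churn

def ltv_cac_ratio(ltv_value: float, cac: float) -> float:
    """LTV divided by customer acquisition cost."""
    return ltv_value / cac

customer_ltv = ltv(arpa_monthly=100, gross_margin=0.8, monthly_churn=0.02)
print(customer_ltv)                          # 4000.0
print(ltv_cac_ratio(customer_ltv, cac=1_000))  # 4.0; a ratio above ~3 is often cited as healthy
```

When a new developer can reason about numbers like these, the debate with marketing described above becomes possible.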

3. Meet: Review results

Sustainable growth is achieved when leadership buys into the growth model, sets realistic targets, keeps in tune with results as they roll in, and demonstrates its commitment to daily, weekly, and monthly iterative improvements by actively participating in playbacks or reviews of team progress and achievements. If the leadership moves away from this model, chances are that the team(s) will too. A dashboard is a great way of aligning teams and leadership, but only if the dashboard is agreed upon and understood by all participants. I highly recommend not building a highly customized tool (read: a new lexicon unique to your business that no one will align on), but instead leveraging one of the many awesome tools used by companies that have been trailblazing on growth. How you establish the right rhythm may vary based on your size, organizational maturity, geography, etc., but it is (or should be) the pulse of your organization. My preference would be a bi-weekly deep dive with a daily offline review of a common dashboard. Each team should have its own child dashboard, with the total results rolled into a summary view. An understanding of every chart and data point, and why the chart exists, must be common and well documented.

4. Celebrate: Focus on learning from failure

Your team thought that removing a field from a sign-up form would improve the conversion rate by 10%. You ran the test, and they were wrong. Maybe that 10% was baked into your revenue models and now you’re worried you’ll miss a target. This can look like failure; you can find a euphemism, but ultimately the team didn’t achieve their target improvement with a change they thought would lead to success. The experiment was a failure. Embrace it. Use the language of experiments and remember that success was achieving a 10% improvement, not reducing the number of fields on a form. Celebrate the learning and work to have teams identify these results early in order to learn and start the next experiment. Concepts like “fail fast” are common in the tech industry today. But ultimately, if failing at all comes at a price, it will be avoided. And because it’s nearly impossible to avoid celebrating success, it’s more important to celebrate failure. There isn’t a singular recipe for growth, and the only way you will likely find yours is by experimenting and never shying away from change. This must be part of your team mentality.
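Calling the sign-up-form experiment above a failure should rest on a statistical read of the results, not a gut feeling. A standard way to check is a two-proportion z-test; this sketch uses only the Python standard library, and the sample counts are invented for illustration:

```python
# Sketch: did the variant (shorter form) actually move conversion?
# Standard two-proportion z-test; traffic and conversion counts below
# are made-up illustrative numbers.
from math import erf, sqrt

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return (z statistic, two-sided p-value) comparing two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Control: 500 of 10,000 visitors convert; variant: 520 of 10,000
z, p = two_proportion_z(500, 10_000, 520, 10_000)
print(f"z={z:.2f}, p={p:.3f}")  # p well above 0.05: no detectable lift
```

A result like this is exactly the “celebrate the learning” case: the team learned the field wasn’t the bottleneck and can move on to the next experiment quickly.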

5. Exploration: Greatest value will be emergent

Ensure that the tool stack you use is one that allows all members of your team to explore your data. I am a strong believer in democratizing data, but that is only meaningful if it’s an active democracy: all team members must be able to meaningfully access and manipulate the data. This means that the tools need to be available and as intuitive as any on the market. Reports or results must take seconds, not minutes or hours, to generate. Access control policies must be permissive. Sharing of results and data should be the default. Does your tool have Slack integration? Can you change an axis and see the results quickly, or do you fear changing anything because recomputing the chart could take forever? Once you have a new chart, can you share it with anyone or only some people? Robust documentation, rich tooling, and full team enablement can and will lead to unexpected, unplanned results. In my experience, the sum of the emergent results has been greater than those that could be reasonably anticipated or planned. You can reasonably plan that a team of engineers and marketers working to improve SEO and product documentation content will lead to a 7% improvement in the conversion ratio on your new product page, reducing your overall CAC significantly. But you didn’t expect that a recent hire, “playing around” in your data, would discover a significant click-farm operation siphoning your ad budget, which, once stopped, also led to significant CAC improvements.


Easier said than done. I’ve read some excellent articles from many companies, big and small, that have made incredible progress in these areas. As powerful as these concepts are, there are forces pulling teams away from them: getting the right feature set for the next big launch, critical bugs, business setbacks, etc. I am convinced, however, that this path of enablement, empowerment, and growth focus will yield the best results, even if hacking some of these concepts into an existing process may be a necessary iteration.
