Explorations into how teams capture, curate and disseminate knowledge about their data.
dbt users fit into two buckets: those who manage their own dbt Core deployment, and those who use the managed dbt Cloud offering from dbt Labs. While that choice shapes much of your environment, both paths leave you with weak options for incident alerting. At Workstream.io, we have naturally run into numerous teams that decided to build their own alerting systems. As software builders ourselves, we love comparing notes with these folks because we learn so much. Here's what you need to know.
Native dbt alerting fails to meet the needs of most data teams because it omits critical information, like which test failed, how many tests failed, or the blast radius of an incident. This quickly causes alert fatigue, even at low incident frequencies. We believe every team should be able to increase failure visibility while keeping flexibility as their team and their investments in data quality grow and mature. So we're thrilled to launch advanced alerting capabilities in Slack for all dbt users, Cloud or Core, for free.
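To make the idea concrete, here is a minimal sketch of the kind of enrichment we're describing, not Workstream's implementation: a script that reads dbt's run_results.json artifact and posts the failed tests to Slack through an incoming webhook. The webhook URL and artifact path below are hypothetical placeholders you would swap for your own.

```python
import json
import urllib.request

# Hypothetical Slack incoming-webhook URL -- replace with your own.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"


def summarize_run_results(path="target/run_results.json"):
    """Collect failed and errored results from dbt's run_results.json artifact."""
    with open(path) as f:
        results = json.load(f)["results"]
    return [r for r in results if r["status"] in ("fail", "error")]


def post_to_slack(failures):
    """Post a short summary of failed dbt tests to a Slack channel."""
    if not failures:
        return
    lines = [f"{len(failures)} dbt test(s) failed:"]
    lines += [f"- {r['unique_id']} ({r['status']})" for r in failures]
    payload = json.dumps({"text": "\n".join(lines)}).encode("utf-8")
    req = urllib.request.Request(
        SLACK_WEBHOOK_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)


if __name__ == "__main__":
    post_to_slack(summarize_run_results())
```

Running a script like this at the end of a dbt job gives the on-call person the test names and failure counts that native notifications leave out.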
Nothing sucks more than being the unlucky person on the data team stuck in the on-call rotation, managing data incidents for the week. This blog breaks down how data incidents have historically been managed and offers new solutions, including automation that lets data teams on the modern data stack empower their business partners, so your data team can get off the hamster wheel and back to work.
Sure, data knowledge sounds fascinating, but what the hell is it? What does data knowledge actually mean for your teams and business outcomes?
Workstream announces data incident automation and a new Monte Carlo integration.
With data assets spread across so many platforms, delivering the right data at the right time can be a huge challenge. Better search and discovery is the starting point for changing this dynamic.
By using your existing data observability tools to surface status pages to end users, you can keep them informed and create a culture of trust around your data.
Join our CEO Nick Freund as he speaks with data leaders Benn Stancil and Danielle Mendheim about the broken process, and the solutions, sitting between data teams and the teams they support.
Knowledge asymmetries between data consumers and data builders are a given. Documentation can help close the gap and improve your data ROI.
Workstream is often compared to a data catalog, and while the comparison makes sense on the surface, the two are more different than they are similar.
Join us as we discuss how to counteract the fragmented knowledge around your data, and bring forward a data-driven culture.
With contextual markers like lifecycle and certification, end users know how, and whether, to use your data assets.
If you don't treat your data consumers like customers, the data products you build may collect dust rather than enable your business.
By enabling consolidated libraries of assets, collections bring your most important work into one place and populate it automatically.
Join us as we discuss the impact of analytics entropy with some of the top data leaders in the space.
We discussed last week how the service desk most data teams employ is broken. This week, we explore how a data concierge can help eliminate these challenges.
Service desks for data teams are too often clunky and transactional, and they fail to contribute value to the organization. In this newsletter, we explore how these experiences got so bad.
With all of the layoffs happening across the tech industry, we wanted to create a resource to help data professionals get back on their feet.
Analytics teams need to understand how the products they create are being used. So why are analytics about them so often an afterthought?
In this article, we discuss how data-driven teams can tame entropy and empower stakeholders with a single access plane for analytics.
Entropy is unavoidable in data asset management, but there are methods—and new tools—that can help create order out of the chaos.
Data asset entropy has been an issue for as long as data has been used in decision making. But in the modern organization, where analytics tools have exploded in number and type, entropy has become a huge stumbling block to effective data-driven operations.
Your company's data is valuable only if you curate, document, and manage it. Options exist for a centralized analytics hub, but it can be hard to know which one is right for your organization.
The goal of your workflow is not just to uncover valuable wisdom from your data. It is just as crucial to communicate those actionable observations to the people in your organization who can use them to make decisions.
We are delighted to announce that Workstream is officially entering its public beta phase after securing $7 million in seed funding led by Lerer Hippeau.
While analysts leverage more powerful analytical tools than ever, the feedback mechanisms we use to navigate our projects are outdated.
Workstream will integrate with existing analytical tools, allowing teams to collaborate seamlessly and orchestrate their projects from start to finish.
Receive regular updates about Workstream, and on our research into the past, present and future of how teams make decisions.