Snowflake Archives - Unravel
https://www.unraveldata.com/resources/snowflake/

Unravel Data Launches Free Snowflake Native App for Cost and Performance Optimization on Snowflake Marketplace
https://www.unraveldata.com/resources/unravel-data-launches-free-snowflake-native-app/ (Tue, 03 Jun 2025)

Snowflake users can get actionable recommendations for cost and performance optimization with Unravel Data’s Snowflake Native App.

SAN FRANCISCO — June 3, 2025 — Unravel Data today announced at Snowflake’s annual user conference, Snowflake Summit 2025, the launch of its Health Check, an AI-driven Snowflake Native App for cost and performance optimization, available on Snowflake Marketplace. Joint customers can now complement Snowflake’s built-in cost and performance optimization capabilities with Health Check to get a full analysis of the health of their data ecosystem, with recommendations for production workloads.

Through this collaboration, Unravel Data and Snowflake, the AI Data Cloud company, will help joint customers inform business decisions and drive innovation by automatically analyzing Snowflake workloads to provide concrete, actionable recommendations to boost performance. As a Snowflake Native App, the Unravel Health Check can be installed and run directly in a customer’s Snowflake account without any data movement, accelerating time to value and reducing data silos.

“Enterprises face a significant execution gap in data platform optimization,” said Kunal Agarwal, CEO at Unravel Data. “Our free Snowflake Native App bridges this gap by not only identifying inefficiencies but also providing concrete actions that are easy to implement.”

Unravel Health Check analyzes up to 30 days of Snowflake usage to deliver optimization recommendations across the following workloads:

  • Warehouse usage — Rightsizing and consolidation opportunities to reduce costs
  • SQL workload — Query enhancements to address inefficiencies in filters, joins, and projections
  • Storage usage — Identification of “cold” tables and suboptimal feature usage

Using the combined capabilities of the Snowflake Native App Framework and Snowpark Container Services, Unravel Health Check offers sophisticated analysis while data remains within the customer’s Snowflake environment.

“What sets Unravel apart is our focus on actionability,” said Shivnath Babu, CTO at Unravel Data. “We deliver specific, implementable guidance that addresses the root causes of inefficiency.”

Users can install the Unravel Health Check directly from Snowflake Marketplace and see results within minutes, with no security or privacy concerns as data never has to leave their environment. Unravel Health Check is available now on the Snowflake Marketplace at no cost.

Snowflake Marketplace helps companies expand what’s possible with data and AI through third-party data, apps, and AI products. With on-platform purchasing and immediate access to data products, Snowflake Marketplace lowers integration costs and streamlines procurement processes. By delivering data, apps, and AI products directly to the customers’ data, providers deliver a superior customer experience and see accelerated revenue growth and increased margins. To learn more about Snowflake Marketplace and how to find, try, and buy third-party products to accelerate your analytics, app development, and AI initiatives, visit Snowflake Marketplace.

The Snowflake Native App Framework enables developers to build applications using Snowflake’s core functionalities, distribute them globally on Snowflake Marketplace, and deploy them within a customer’s Snowflake account. To learn more about the Snowflake Native App Framework and how to become a Snowflake partner, visit the Snowflake website.

About Unravel Data

Unravel Data is a data observability and FinOps platform that provides full-stack visibility, AI-powered recommendations, and actionable automations to help optimize the performance, cost, and reliability of modern data applications. For more information, visit unraveldata.com.

Media Contact:

Unravel Data PR Team
hello@unraveldata.com

Mastering Cost Management: From Reactive Spending to Proactive Optimization
https://www.unraveldata.com/resources/mastering-cost-management/ (Wed, 05 Feb 2025)

According to Forrester, accurately forecasting cloud costs remains a significant challenge for 80% of data management professionals. This struggle often stems from a lack of granular visibility, control over usage, and ability to optimize code and infrastructure for cost and performance. Organizations utilizing modern data platforms like Snowflake, BigQuery, and Databricks often face unexpected budget overruns, missed performance SLAs, and inefficient resource allocation.

Transitioning from reactive spending to proactive optimization is crucial for effective cost management in modern data stack environments.

This shift requires a comprehensive approach that encompasses several key strategies:

1. Granular Visibility
Gain comprehensive insights into expenses by unifying fragmented data and breaking down silos, enabling precise financial planning and resource allocation for effective cost control. This unified approach allows teams to identify hidden cost drivers and inefficiencies across the entire data ecosystem.

By consolidating data from various sources, organizations can create a holistic view of their spending patterns, facilitating more accurate budget forecasting and informed decision-making. Additionally, this level of visibility empowers teams to pinpoint opportunities for optimization, such as underutilized resources or redundant processes, leading to significant cost savings over time.

2. ETL Pipeline Optimization
Design cost-effective pipelines from the outset, implementing resource utilization best practices and ongoing performance monitoring to identify and address inefficiencies. This approach involves carefully architecting ETL processes to minimize resource usage while maintaining optimal performance.

By employing advanced performance tuning techniques, such as optimizing query execution plans and leveraging built-in optimizations, organizations can significantly reduce processing time and associated costs. Continuous monitoring of pipeline performance allows for the early detection of bottlenecks or resource-intensive operations, enabling timely adjustments and ensuring sustained efficiency over time.

3. Intelligent Resource Management
Implement intelligent autoscaling to dynamically adjust resources based on workload demands, optimizing costs in real time while maintaining performance. Efficiently manage data lake and compute resources to minimize unnecessary expenses during scaling. This approach allows organizations to automatically provision and de-provision resources as needed, ensuring optimal utilization and cost-efficiency.

By setting appropriate scaling policies and thresholds, you can avoid over-provisioning during periods of low demand and ensure sufficient capacity during peak usage times. Additionally, separating storage and compute resources enables more granular control over costs, allowing you to scale each component independently based on specific requirements.
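To make this concrete in Snowflake terms, multi-cluster warehouses can scale out and back within bounds you set. A minimal sketch, with the warehouse name and limits as placeholders (multi-cluster warehouses require Enterprise edition or higher):

    -- Multi-cluster warehouse that scales out to 4 clusters under load and
    -- back to 1 when demand drops. ECONOMY favors keeping clusters fully
    -- loaded (lower cost) over starting new ones quickly (lower latency).
    CREATE WAREHOUSE IF NOT EXISTS etl_wh WITH
      WAREHOUSE_SIZE    = 'MEDIUM'
      MIN_CLUSTER_COUNT = 1
      MAX_CLUSTER_COUNT = 4
      SCALING_POLICY    = 'ECONOMY'
      AUTO_SUSPEND      = 60      -- suspend after 60 idle seconds
      AUTO_RESUME       = TRUE;

Because storage is billed separately from compute, such a warehouse can be resized or suspended at any time without touching the data it queries.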

4. FinOps Culture
Foster collaboration between data and finance teams, implementing cost allocation strategies like tagging and chargeback mechanisms to attribute expenses to specific projects or teams accurately. This approach creates a shared responsibility for cloud costs and promotes organizational transparency.

By establishing clear communication channels and regular meetings between technical and financial stakeholders, teams can align their efforts to optimize resource utilization and spending. A robust tagging system also allows for detailed cost breakdowns, enabling more informed decision-making and budget allocation based on actual usage patterns.
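In Snowflake, for instance, the tagging side of this strategy can be implemented with object tags. A minimal sketch, with the database, schema, warehouse, and tag value invented for illustration:

    -- Tags are schema-level objects: create one, attach it to a warehouse,
    -- then group metered credits by tag value for chargeback reporting.
    CREATE TAG IF NOT EXISTS admin.tags.cost_center;
    ALTER WAREHOUSE analytics_wh SET TAG admin.tags.cost_center = 'marketing';

    -- Credits per cost center from the account usage share.
    SELECT t.tag_value, SUM(m.credits_used) AS credits
    FROM snowflake.account_usage.warehouse_metering_history m
    JOIN snowflake.account_usage.tag_references t
      ON t.object_name = m.warehouse_name AND t.domain = 'WAREHOUSE'
    WHERE t.tag_name = 'COST_CENTER'
    GROUP BY t.tag_value;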

5. Advanced Forecasting
Develop sophisticated forecasting techniques and flexible budgeting strategies using historical data and AI-driven analytics to accurately predict future costs and create adaptive budgets that accommodate changing business needs. Organizations can identify trends and seasonal variations that impact costs by analyzing past usage patterns and performance metrics.

This data-driven approach enables more precise resource allocation and helps teams anticipate potential cost spikes, allowing for proactive adjustments to prevent budget overruns. Additionally, implementing AI-powered forecasting models can provide real-time insights and recommendations, enabling continuous optimization of environments as workloads and business requirements evolve.

Mastering these strategies can help you transform your approach to cost management from reactive to proactive, ensuring you maximize the value of your cloud investments while maintaining financial control.

To learn more about implementing these cost management strategies in your modern data environment, join our upcoming webinar series, “Controlling Cloud Costs.” This ten-part series will explore each aspect of effective cost management, providing actionable insights and best practices to gain control over your data platform costs.

Register for Controlling Databricks Cloud Cost webinars.

Register for Controlling Snowflake Cloud Cost webinars.

Snowflake Cost Management
https://www.unraveldata.com/resources/snowflake-cost-management/ (Wed, 06 Nov 2024)

Mastering Snowflake Cost Management and FinOps: A Comprehensive Checklist

Effective cost management becomes paramount as organizations leverage Snowflake’s powerful cloud data platform for their analytics and data warehousing needs. This comprehensive checklist explores the intricacies of cost management and FinOps for Snowflake, delving into strategies to inform, govern, and optimize usage while taking a holistic approach that considers queries, storage, compute resources, and more.

While this checklist is comprehensive and highly impactful when implemented fully, it can be overwhelming to implement with limited staffing and resources. AI-driven insights and automation can solve this problem and are explored at the end of this guide.

Understanding Cost Management for Snowflake

Snowflake’s unique architecture separates compute and storage, offering a flexible pay-as-you-go model. While this provides scalability and performance benefits, it also requires careful management to ensure costs align with business value.

Effective Snowflake cost management is about more than reducing expenses—it’s also about optimizing spend, ensuring efficient resource utilization, and aligning costs with business outcomes. This comprehensive approach falls under the umbrella of FinOps (Financial Operations).

The Holistic Approach: Key Areas to Consider

1. Compute Optimization

Are compute resources allocated efficiently? Key levers include (a sketch of the first two follows this list):

  • Virtual Warehouse Sizing: Right-size your virtual warehouses based on workload requirements.
  • Auto-suspend and Auto-resume: Leverage Snowflake’s auto-suspend and auto-resume features to minimize idle time.
  • Query Optimization: Write efficient SQL queries to reduce compute time and costs.
  • Materialized Views: Use materialized views for frequently accessed or complex query results.
  • Result Caching: Utilize Snowflake’s result caching to avoid redundant computations.
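A minimal sketch of the first two levers, with the warehouse name and values as placeholders:

    -- A right-sized warehouse that suspends after a minute of inactivity
    -- and resumes transparently on the next query.
    CREATE WAREHOUSE IF NOT EXISTS bi_wh WITH
      WAREHOUSE_SIZE = 'XSMALL'
      AUTO_SUSPEND   = 60     -- seconds of idle time before suspending
      AUTO_RESUME    = TRUE;

    -- Result caching is on by default; confirm it has not been disabled.
    SHOW PARAMETERS LIKE 'USE_CACHED_RESULT' IN SESSION;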

2. Resource Monitoring and Governance

Are the right policies and governance in place? Proper monitoring and governance are essential for cost management (a resource monitor sketch follows this list):

  • Resource Monitors: Set up resource monitors to track and limit credit usage.
  • Account Usage and Information Schema Views: Utilize these views to gain insights into usage patterns and costs.
  • Role-Based Access Control (RBAC): Implement RBAC to ensure appropriate resource access and usage.
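As a sketch of the first item (the quota, thresholds, and names are illustrative, and creating resource monitors is typically an ACCOUNTADMIN task):

    -- Monthly credit quota with a soft warning and a hard stop.
    CREATE OR REPLACE RESOURCE MONITOR monthly_quota WITH
      CREDIT_QUOTA = 500
      FREQUENCY = MONTHLY
      START_TIMESTAMP = IMMEDIATELY
      TRIGGERS ON 80 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND;

    ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = monthly_quota;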

3. Storage Management

Is storage managed efficiently? While storage is typically a smaller portion of Snowflake costs, it’s still important to manage well (two of these levers are sketched after this list):

  • Data Lifecycle Management: Implement policies for data retention and archiving.
  • Time Travel and Fail-safe: Optimize usage of Time Travel and Fail-safe features based on your data recovery needs.
  • Zero-copy Cloning: Leverage zero-copy cloning for testing and development to avoid duplicating storage costs.
  • Data Compression: Use appropriate compression methods to reduce storage requirements.
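Two of these levers reduce to one-line statements; a sketch with invented object names:

    -- Shorten Time Travel retention on a high-churn staging table (the
    -- default is 1 day, up to 90 on Enterprise edition): less retained
    -- history means less storage billed.
    ALTER TABLE staging.events SET DATA_RETENTION_TIME_IN_DAYS = 0;

    -- Zero-copy clone for development: no storage is duplicated until the
    -- clone's data diverges from the source.
    CREATE DATABASE dev_db CLONE prod_db;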

4. Data Sharing and Marketplace

Are data sharing and marketplace usage optimized?

  • Secure Data Sharing: Leverage Snowflake’s secure data sharing to reduce data movement and associated costs.
  • Marketplace Considerations: Carefully evaluate the costs and benefits of data sets or applications from Snowflake Marketplace.

Implementing FinOps Practices

To master Snowflake cost management, consider these FinOps practices:

1. Visibility and Reporting

  • Implement comprehensive tagging strategies for resources.
  • Create custom dashboards using Snowsight or third-party BI tools for cost visualization (a starter query follows this list).
  • Set up alerts for unusual spending patterns or budget overruns.
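As a starting point for such a dashboard or alert, daily credit burn per warehouse can be pulled straight from the account usage share; a sketch:

    -- Daily credits by warehouse over the last 30 days.
    SELECT warehouse_name,
           DATE_TRUNC('day', start_time) AS usage_day,
           SUM(credits_used)             AS credits
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('day', -30, CURRENT_TIMESTAMP())
    GROUP BY 1, 2
    ORDER BY usage_day DESC, credits DESC;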

2. Optimization

  • Regularly review and optimize warehouse configurations and query performance.
  • Implement automated processes to identify and optimize high-cost queries or inefficient warehouses.
  • Foster a culture of cost awareness among data analysts, engineers, and scientists.

3. Governance

  • Establish clear policies for warehouse creation, data ingestion, and resource provisioning.
  • Implement approval workflows for high-cost operations or large-scale data imports.
  • Create and enforce organizational policies to prevent costly misconfigurations.

Setting Up Guardrails

Implementing guardrails is crucial to prevent unexpected costs (the timeout parameters are sketched after this list):

  • Resource Monitors: Set up resource monitors with actions (suspend or notify) when thresholds are reached.
  • Warehouse Size Limits: Establish policies on maximum warehouse sizes for different user groups.
  • Query Timeouts: Configure appropriate query timeouts to prevent runaway queries.
  • Data Retention Policies: Implement automated data retention and archiving policies.
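The resource monitor shown earlier covers the first item; the query timeout guardrails are warehouse parameters, sketched here with illustrative values:

    -- Cancel any statement that runs longer than 30 minutes on this
    -- warehouse, and drop statements queued for more than 10 minutes.
    ALTER WAREHOUSE adhoc_wh SET STATEMENT_TIMEOUT_IN_SECONDS = 1800;
    ALTER WAREHOUSE adhoc_wh SET STATEMENT_QUEUED_TIMEOUT_IN_SECONDS = 600;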

The Need for Automated Observability and FinOps Solutions

Given the complexity of modern data operations, automated solutions can significantly enhance cost management efforts. Automated observability and FinOps solutions can provide the following:

  • Real-time cost visibility across your entire Snowflake environment.
  • Automated recommendations for query optimization and warehouse right-sizing.
  • Anomaly detection to quickly identify unusual spending patterns.
  • Predictive analytics to forecast future costs and resource needs.

These solutions can offer insights that would be difficult or impossible to obtain manually, helping you make data-driven decisions about your Snowflake usage and costs.

Snowflake-Specific Cost Optimization Techniques

  • Cluster Keys: Properly define cluster keys to improve data clustering and query performance (sketched after this list).
  • Search Optimization: Use the search optimization service for tables with frequent point lookup queries.
  • Multi-cluster Warehouses: Leverage multi-cluster warehouses for concurrency without over-provisioning.
  • Workload Isolation: Run different workloads on separate warehouses to manage their priorities and costs independently.
  • Snowpipe: Consider Snowpipe for continuous, cost-effective data ingestion.
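For the cluster key item, a sketch with an invented table and columns:

    -- Cluster a large table on its most common filter columns, then check
    -- how well micro-partitions line up with those columns.
    ALTER TABLE sales.orders CLUSTER BY (order_date, customer_id);
    SELECT SYSTEM$CLUSTERING_INFORMATION('sales.orders',
                                         '(order_date, customer_id)');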

Conclusion

Effective Snowflake cost management and FinOps require a holistic approach considering all aspects of your data operations. By optimizing compute resources, managing storage efficiently, implementing robust governance, and leveraging Snowflake-specific features, you can ensure that your Snowflake investment delivers maximum value to your organization.

Remember, the goal isn’t just to reduce costs, but to optimize spend and align it with business objectives. With the right strategies and tools in place, you can transform cost management from a challenge into a competitive advantage, enabling your organization to make the most of Snowflake’s powerful capabilities while maintaining control over expenses.

By continuously monitoring, optimizing, and governing your Snowflake usage, you can achieve a balance between performance, flexibility, and cost-efficiency, ultimately driving better business outcomes through data-driven decision-making.

To learn more about how Unravel can help optimize your Snowflake costs, request a health check report, view a self-guided product tour, or request a personalized demo.

Snowflake Code Optimization
https://www.unraveldata.com/resources/snowflake-code-optimization/ (Wed, 06 Nov 2024)

The Complexities of Code Optimization in Snowflake: Problems, Challenges, and Solutions

In the world of Snowflake data warehousing, code optimization is crucial for managing costs and ensuring efficient resource utilization. However, this process is fraught with challenges that can leave even experienced data teams scratching their heads.

This blog post explores the complexities of code optimization in Snowflake, the difficulties in diagnosing and resolving issues, and how automated solutions can simplify this process.

The Snowflake Code Optimization Puzzle

1. Inefficient JOIN Operations

Problem: Large table joins often lead to excessive data shuffling and prolonged query times, significantly increasing credit consumption.

Diagnosis Challenge: Pinpointing the exact cause of a slow JOIN is like finding a needle in a haystack. Is it due to poor join conditions, lack of proper clustering, or simply the volume of data involved? The query plan might show a large data shuffle, but understanding why it’s happening and how to fix it requires deep expertise and time-consuming investigation.

Resolution Difficulty: Optimizing JOINs often involves a trial-and-error process. You might need to experiment with different join types, adjust clustering keys, or even consider restructuring your data model. Each change requires careful testing to ensure it doesn’t negatively impact other queries or downstream processes.
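One hedged starting point for the diagnosis is Snowflake’s query history, whose spill metrics often betray an oversized or badly conditioned join; a sketch:

    -- Recent queries that spilled to remote storage: a common symptom of
    -- joins whose intermediate results exceed warehouse memory.
    SELECT query_id,
           total_elapsed_time / 1000 AS elapsed_s,
           bytes_spilled_to_local_storage,
           bytes_spilled_to_remote_storage,
           query_text
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
      AND bytes_spilled_to_remote_storage > 0
    ORDER BY total_elapsed_time DESC
    LIMIT 20;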

2. Suboptimal Data Clustering

Problem: Poor choices in clustering keys lead to inefficient data access patterns, increasing query times and, consequently, costs.

Diagnosis Challenge: The effects of suboptimal clustering are often subtle and vary depending on query patterns. A clustering key that works well for one set of queries might be terrible for another. Identifying the root cause requires analyzing a wide range of queries over time, a task that’s both time-consuming and complex.

Resolution Difficulty: Changing clustering keys is not a trivial operation. It requires careful planning, as it can temporarily increase storage costs and impact query performance during the re-clustering process. Determining the optimal clustering strategy often requires extensive A/B testing and monitoring.

3. Inefficient Use of UDFs

Problem: While powerful, User-Defined Functions (UDFs) can lead to unexpected performance issues and increased credit consumption if not used correctly.

Diagnosis Challenge: UDFs are often black boxes from a performance perspective. Traditional query profiling tools might show that a UDF is slow, but they can’t peer inside to identify why. This opacity makes it extremely difficult to pinpoint the root cause of UDF-related performance issues.

Resolution Difficulty: Optimizing UDFs often requires rewriting them from scratch, which can be time-consuming and risky. You might need to balance between UDF performance and maintainability, and in some cases, completely rethink your approach to the problem the UDF was solving.

4. Complex, Monolithic Queries

Problem: Large, complex queries can be incredibly difficult to optimize and may not leverage Snowflake’s MPP architecture effectively, leading to increased execution times and costs.

Diagnosis Challenge: Understanding the performance characteristics of a complex query is like solving a multidimensional puzzle. Each part of the query interacts with others in ways that can be hard to predict. Traditional query planners may struggle to provide useful insights for such queries.

Resolution Difficulty: Optimizing complex queries often requires breaking them down into smaller, more manageable parts. This process can be incredibly time-consuming and may require significant refactoring of not just the query, but also the surrounding ETL processes and downstream dependencies.
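One common decomposition pattern, sketched below with invented table names, is to materialize an intermediate result so that each stage gets its own query profile and can be tuned in isolation:

    -- Stage 1: compute the filtered subset once, instead of repeating the
    -- scan inside several subqueries of one monolithic statement.
    CREATE TEMPORARY TABLE recent_orders AS
    SELECT order_id, customer_id, amount
    FROM sales.orders
    WHERE order_date >= DATEADD('day', -30, CURRENT_DATE());

    -- Stage 2: aggregate from the small intermediate table.
    SELECT c.region, SUM(o.amount) AS revenue
    FROM recent_orders o
    JOIN sales.customers c ON c.customer_id = o.customer_id
    GROUP BY c.region;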

The Manual Optimization Struggle

Traditionally, addressing these challenges involves a cycle of:

1. Manually sifting through query histories and execution plans
2. Conducting time-consuming A/B tests
3. Carefully monitoring the impact of changes across various workloads
4. Rinse and repeat

This process is not only time-consuming but also prone to human error. It requires deep expertise in Snowflake’s architecture, SQL optimization techniques, and your specific data model. Even then, optimizations that work today might become inefficient as your data volumes and query patterns evolve.

The Power of Automation in Snowflake Optimization

Given the complexities and ongoing nature of these challenges, many organizations are turning to automated solutions to simplify and streamline their Snowflake optimization efforts. Tools like Unravel can help by:

Continuous Monitoring: Automatically tracking query performance, resource utilization, and cost metrics across your entire Snowflake environment.

Intelligent Analysis: Using machine learning algorithms to identify patterns and anomalies that might be missed by manual analysis.

Root Cause Identification: Quickly pinpointing the source of performance issues, whether they’re related to query structure, data distribution, or resource allocation.

Optimization Recommendations: Providing actionable suggestions for query rewrites, clustering key changes, and resource allocation adjustments.

Impact Prediction: Estimating the potential performance and cost impacts of suggested changes before you implement them.

Automated Tuning: In some cases, automatically applying optimizations based on predefined rules and thresholds.

By leveraging such automated solutions, data teams can focus their expertise on higher-value tasks while ensuring their Snowflake environment remains optimized and cost-effective. Instead of spending hours digging through query plans and execution logs, teams can quickly identify and resolve issues, or even prevent them from occurring in the first place.

Conclusion

Code optimization in Snowflake is a complex, ongoing challenge that requires continuous attention and expertise. While the problems are multifaceted and the manual diagnosis and resolution process can be daunting, automated solutions offer a path to simplify and streamline these efforts. By leveraging such tools, organizations can more effectively manage their Snowflake costs, improve query performance, and allow their data teams to focus on delivering value rather than constantly fighting optimization battles.

Remember, whether you’re using manual methods or automated tools, optimization is an ongoing process. As your data volumes grow and query patterns evolve, staying on top of performance and cost management will ensure that your Snowflake implementation continues to deliver the insights your business needs, efficiently and cost-effectively.

To learn more about how Unravel can help optimize your code in Snowflake, request a health check report, view a self-guided product tour, or request a demo.

Configuration Management in Modern Data Platforms
https://www.unraveldata.com/resources/configuration-management-in-modern-data-platforms/ (Wed, 06 Nov 2024)

Navigating the Maze of Configuration Management in Modern Data Platforms: Problems, Challenges, and Solutions

In the world of big data, configuration management is often the unsung hero of platform performance and cost-efficiency. Whether you’re working with Snowflake, Databricks, BigQuery, or any other modern data platform, effective configuration management can mean the difference between a sluggish, expensive system and a finely-tuned, cost-effective one.

This blog post explores the complexities of configuration management in data platforms, the challenges in optimizing these settings, and how automated solutions can simplify this critical task.

The Configuration Conundrum

1. Cluster and Warehouse Sizing

Problem: Improper sizing of compute resources (like Databricks clusters or Snowflake warehouses) can lead to either performance bottlenecks or unnecessary costs.

Diagnosis Challenge: Determining the right size for your compute resources is not straightforward. It depends on workload patterns, data volumes, and query complexity, all of which can vary over time. Identifying whether performance issues or high costs are due to improper sizing requires analyzing usage patterns across multiple dimensions.

Resolution Difficulty: Adjusting resource sizes often involves a trial-and-error process. Too small, and you risk poor performance; too large, and you’re wasting money. The impact of changes may not be immediately apparent and can affect different workloads in unexpected ways.
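In Snowflake, at least, the experiment itself is cheap to set up: resizing is a single statement that applies to newly submitted queries, and the load history shows whether the smaller size is queueing work. A sketch with an illustrative warehouse name:

    -- Step the warehouse down one size...
    ALTER WAREHOUSE reporting_wh SET WAREHOUSE_SIZE = 'SMALL';

    -- ...then watch concurrency and queueing before committing to it.
    SELECT start_time, avg_running, avg_queued_load
    FROM snowflake.account_usage.warehouse_load_history
    WHERE warehouse_name = 'REPORTING_WH'
    ORDER BY start_time DESC
    LIMIT 48;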

2. Caching and Performance Optimization Settings

Problem: Suboptimal caching strategies and performance settings can lead to repeated computations and slow query performance.

Diagnosis Challenge: The effectiveness of caching and other performance optimizations can be highly dependent on specific workload characteristics. Identifying whether poor performance is due to cache misses, inappropriate caching strategies, or other factors requires deep analysis of query patterns and platform-specific metrics.

Resolution Difficulty: Tuning caching and performance settings often requires a delicate balance. Aggressive caching might improve performance for some queries while causing staleness issues for others. Each adjustment needs to be carefully evaluated across various workload types.

3. Security and Access Control Configurations

Problem: Overly restrictive security settings can hinder legitimate work, while overly permissive ones can create security vulnerabilities.

Diagnosis Challenge: Identifying the root cause of access issues can be complex, especially in platforms with multi-layered security models. Is a performance problem due to a query issue, or is it because of an overly restrictive security policy?

Resolution Difficulty: Adjusting security configurations requires careful consideration of both security requirements and operational needs. Changes need to be thoroughly tested to ensure they don’t inadvertently create security holes or disrupt critical workflows.

4. Cost Control and Resource Governance

Problem: Without proper cost control measures, data platform expenses can quickly spiral out of control.

Diagnosis Challenge: Understanding the cost implications of various platform features and usage patterns is complex. Is a spike in costs due to inefficient queries, improper resource allocation, or simply increased usage?

Resolution Difficulty: Implementing effective cost control measures often involves setting up complex policies and monitoring systems. It requires balancing cost optimization with the need for performance and flexibility, which can be a challenging trade-off to manage.

The Manual Configuration Management Struggle

Traditionally, managing these configurations involves:

1. Continuously monitoring platform usage, performance metrics, and costs
2. Manually adjusting configurations based on observed patterns
3. Conducting extensive testing to ensure changes don’t negatively impact performance or security
4. Constantly staying updated with platform-specific best practices and new features
5. Repeating this process as workloads and requirements evolve

This approach is not only time-consuming but also reactive. By the time an issue is noticed and diagnosed, it may have already impacted performance or inflated costs. Moreover, the complexity of modern data platforms means that the impact of configuration changes can be difficult to predict, leading to a constant cycle of tweaking and re-adjusting.

Embracing Automation in Configuration Management

Given these challenges, many organizations are turning to automated solutions to manage and optimize their data platform configurations. Platforms like Unravel can help by:

Continuous Monitoring: Automatically tracking resource utilization, performance metrics, and costs across all aspects of the data platform.

Intelligent Analysis: Using machine learning to identify patterns and anomalies in platform usage and performance that might indicate configuration issues.

Predictive Optimization: Suggesting configuration changes based on observed usage patterns and predicting their impact before implementation.

Automated Adjustment: In some cases, automatically adjusting configurations within predefined parameters to optimize performance and cost.

Policy Enforcement: Helping to implement and enforce governance policies consistently across the platform.

Cross-Platform Optimization: For organizations using multiple data platforms, providing a unified view and consistent optimization approach across different environments.

By leveraging automated solutions, data teams can shift from a reactive to a proactive configuration management approach. Instead of constantly fighting fires, teams can focus on strategic initiatives while ensuring their data platforms remain optimized, secure, and cost-effective.

Conclusion

Configuration management in modern data platforms is a complex, ongoing challenge that requires continuous attention and expertise. While the problems are multifaceted and the manual management process can be overwhelming, automated solutions offer a path to simplify and streamline these efforts.

By embracing automation in configuration management, organizations can more effectively optimize their data platform performance, enhance security, control costs, and free up their data teams to focus on extracting value from data rather than endlessly tweaking platform settings.

Remember, whether using manual methods or automated tools, effective configuration management is an ongoing process. As your data volumes grow, workloads evolve, and platform features update, staying on top of your configurations will ensure that your data platform continues to meet your business needs efficiently and cost-effectively.

To learn more about how Unravel can help manage and optimize your data platform configurations with Databricks, Snowflake, and BigQuery: request a health check report, view a self-guided product tour, or request a demo.

Discover Your Snowflake Health: Sample Data Estate Report
https://www.unraveldata.com/resources/discover-your-snowflake-health-sample-data-estate-report/ (Thu, 09 May 2024)

Snowflake Health: Sample Data Estate Report

Download a sample report that includes insights into the health of a Snowflake data estate:

  • Performance insights: See the speedup possible with improved warehouse utilization.
  • Productivity boost: Uncover top operational improvements with ease.
  • Savings projection: View projected annualized savings for warehouses and queries.
  • SLA attainment: Measure potential improvements to data pipeline times.
  • Query health: See which queries are failing most frequently and fix them to improve your Snowflake data estate.

Data Observability + FinOps for Snowflake Engineers
https://www.unraveldata.com/resources/data-observability-finops-for-snowflake-engineers/ (Fri, 26 Jan 2024)

AI-DRIVEN DATA OBSERVABILITY + FINOPS FOR SNOWFLAKE DATA ENGINEERS

Snowflake data engineers are under enormous pressure to deliver results. This data sheet provides more context about the challenges data engineers face and how Unravel helps them address these challenges.

Specifically, it discusses:

  • Key Snowflake data engineering roadblocks
  • Unravel’s purpose-built AI for Snowflake
  • Data engineering benefits

With Unravel, Snowflake data engineers can speed data pipeline development and analytics initiatives with granular, real-time cost visibility, predictive spend forecasting, and performance insights for their data cloud.

To see Unravel Data for Snowflake in action, contact our data experts.

Announcing Unravel for Snowflake: Faster Time to Business Value in the Data Cloud
https://www.unraveldata.com/resources/announcing-unravel-for-snowflake-faster-time-to-business-value-in-the-data-cloud/ (Tue, 14 Nov 2023)

Snowflake’s data cloud has expanded to become a top choice among organizations looking to leverage data and AI, including large language models (LLMs) and other types of generative AI, to deliver innovative new products to end users and customers. However, the democratization of AI often leads to inefficient usage that results in a cost explosion and decreases the business value of Snowflake. The inefficient usage of Snowflake can occur at various levels. Below are just some examples.

  • Warehouses: A warehouse that is too large or has too many clusters for a given workload will be underutilized and incur a higher cost than necessary, whereas the opposite (the warehouse being too small or having too few clusters) will not do the work fast enough. 
  • Workload: The democratization of AI results in a rapidly increasing number of SQL users, many of whom focus on getting value out of the data and do not think about the performance and cost aspects of running SQL on Snowflake. This often leads to costly practices (illustrated in the sketch after this list), such as:
    • SELECT * just to check out the schema
    • Running a long, timed-out query repeatedly without checking why it timed out
    • Expensive joins such as cartesian products
  • Data: Missing or poorly chosen clustering or partition keys lead to excessive table scans. Unused tables accumulate over time, and by the time users notice the data explosion, they have a hard time knowing which tables can be deleted safely.
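To make the workload anti-patterns above concrete, a short sketch with invented table names:

    -- Anti-pattern: scanning a wide table just to inspect its schema.
    SELECT * FROM analytics.events;

    -- Cheaper: DESCRIBE is a metadata-only operation, and a LIMIT probe
    -- touches only a handful of micro-partitions.
    DESCRIBE TABLE analytics.events;
    SELECT * FROM analytics.events LIMIT 10;

    -- Anti-pattern: an accidental cartesian product from a missing join key.
    --   SELECT a.id, b.id FROM big_a a, big_b b;
    -- Fix: make the join condition explicit.
    SELECT a.id, b.id
    FROM big_a a
    JOIN big_b b ON b.a_id = a.id;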

Snowflake, like other leading cloud data platforms, provides various compute options and automated features to help control resource usage and spending. However, you need to understand the characteristics of your workload and other KPIs, and have domain expertise, to pick the right options and settings for these features, not to mention there’s no native support to identify bad practices and anti-patterns in SQL. Lastly, optimization is not a one-time exercise. As business needs evolve, so do workloads and data; optimizing for cost and performance becomes part of the continued governance of data and AI operations.

Introducing Unravel for Snowflake

Unravel for Snowflake is a purpose-built AI-driven data observability and FinOps solution that enables organizations to get maximum business value from Snowflake by achieving high cost efficiency and performance. It does so by providing deep, correlated visibility into cost, workload, and performance, with AI-powered insights and automated guardrails for continued optimization and governance. Expanding the existing portfolio of purpose-built AI solutions for Databricks, BigQuery, Amazon EMR, and Cloudera, Unravel for Snowflake is the latest data observability and FinOps product from Unravel Data.

The new Unravel for Snowflake features align with the FinOps phases of inform, optimize, operate:

INFORM

  • A “Health Check” that provides a top-down summary view of cost, usage, insights to improve inefficiencies, and projected annualized savings from applying these insights
  • A Cost 360 view that captures complete cost across compute, storage and data transfer, and shows chargeback and trends of cost and usage across warehouses, users, tags, and queries
  • Top-K most expensive warehouses, users, and queries
  • Detailed warehouse and query views with extensive KPIs
  • Side-by-side comparison of queries

OPTIMIZE

  • Warehouse insights 
  • SQL insights
  • Data and storage insights

OPERATE

  • OpenSearch-based alerts on query duration and credits
  • Alert customization: ability to create custom alerts

Let us first take a look at the Health Check feature that kick-starts the journey of cost and performance optimization.

Health Check for Cost and Performance Inefficiencies

Unravel for Snowflake Health Check: dashboard-level summary of usage and costs, AI insights into inefficiencies, and projected savings.

The Health Check feature automatically analyzes the workload and cost over the past 15 days. It generates a top-down summary of the cost and usage during this period and, more importantly, surfaces insights to address cost and performance inefficiencies, along with the projected annualized savings from applying these insights.

Unravel for Snowflake Most Expensive Queries report: see at a glance your most expensive “query signatures,” with AI-driven insights on reducing costs.

Users can easily spot the most impactful insights at the warehouse and query levels, and drill down to find out the details. They can also view the top 10 most expensive “query signatures,” or groups of similar queries. Lastly, it recommends alerting policies specific to your workload. 

Users can run the Health Check regularly to find new insights and their projected impact on savings. As workloads evolve with new business use cases, new inefficiencies may arise and require continued monitoring and governance.

Uncover your Snowflake savings with a free Health Check report
Request report here

Digging Deep into Understanding Spending

Unravel for Snowflake Cost 360 for users and queries: cost chargeback breakdown and trends by users and queries.

Unravel also enables you to easily visualize and understand where money is spent and whether there are anomalies that you should investigate. The Cost 360 view provides cost breakdown and trends across warehouses, users, queries, and tags. It also shows top offenders by listing the most expensive warehouses, users, and query signatures, so that users can address them first.

Unravel for Snowflake Cost 360 for warehouses: cost chargeback breakdown and trends by warehouses and tags.

Debugging and Optimizing Failed, Costly, and Slow Queries

Unravel for Snowflake query view: drill-down AI-driven insights and recommendations into query cost and performance.

Unravel captures extensive data and metadata about cost, workload, and data, and automatically applies AI to generate insights and recommendations for each query and warehouse. Users can filter queries based on status, insights, duration, etc., to find interesting queries, and drill down to look at query details, including the insights for cost and performance optimization. They can also see similar queries to a given query and do side-by-side comparison to spot the difference between two runs.

Get Started with Unravel for Snowflake

To conclude, Unravel supports a variety of use cases in FinOps, from understanding cost and usage to optimizing inefficiencies and providing alerts for governance. Learn more about Unravel for Snowflake by reading the Unravel for Snowflake docs and requesting a personalized Snowflake Health Check report.

 

Role: FinOps Practitioner
Scenario: Understand what we pay for Snowflake down to the user/app level in real time; accurately forecast future spend with confidence.
Unravel benefits: Granular visibility at the warehouse, query, and user level enables FinOps practitioners to perform cost allocation, estimate annual data cloud application costs, and analyze cost drivers, break-even, and ROI.

Role: FinOps Practitioner / Engineering / Operations
Scenario: Identify the most impactful recommendations to optimize overall cost and performance.
Unravel benefits: AI-powered performance and cost optimization recommendations enable FinOps and data teams to rapidly upskill team members, implement cost efficiency SLAs, and optimize Snowflake pricing tier usage to maximize the company’s cloud data ROI.

Role: Engineering Lead / Product Owner
Scenario: Identify the most impactful recommendations to optimize the cost and performance of a warehouse.
Unravel benefits: AI-driven insights and recommendations enable product and data teams to improve warehouse utilization, boost SQL query performance, and leverage clustering to achieve cost efficiency SLAs and launch more data queries within the same warehouse budget.

Role: Engineering / Operations
Scenario: Live monitoring with alerts.
Unravel benefits: Live monitoring with alerts speeds mean time to repair (MTTR) and prevents outages before they happen.

Role: Data Engineer
Scenario: Debugging a query and comparing queries.
Unravel benefits: Automatic troubleshooting guides data teams directly to the source of query failures, down to the line of code or SQL query, along with AI recommendations to fix it and prevent future issues.

Role: Data Engineer
Scenario: Identify expensive, inefficient, or failed queries.
Unravel benefits: Proactively improve cost efficiency, performance, and reliability before deploying queries into production. Compare two queries side by side to find any metrics that differ between the two runs, even if the queries are different.

Top 4 Challenges to Scaling Snowflake for AI
https://www.unraveldata.com/resources/top-4-challenges-to-scaling-snowflake-for-ai/ (Tue, 14 Nov 2023)

Organizations are transforming their industries through the power of data analytics and AI. A recent McKinsey survey finds that 75% expect generative AI (GenAI) to “cause significant or disruptive change in the nature of their industry’s competition in the next three years.” AI enables businesses to launch innovative new products, gain insights into their business, and boost profitability through technologies that help them outperform competitors. Organizations that don’t leverage data and AI risk falling behind.

Despite all the opportunities with data and AI, many find ROI with advanced technologies like IoT, AI, and predictive analytics elusive. For example, companies find it difficult to get accurate and granular reporting on compute and storage for cloud data and analytics workloads. In speaking with enterprise customers, we hear several recurring barriers to achieving their desired ROI on the data cloud.

Cloud data spend is challenging to forecast

About 80% of 157 data management professionals express difficulty predicting data-related cloud costs. Data cloud spend can be difficult to reliably predict. Sudden spikes in data volumes, new analytics use cases, and new data products require additional cloud resources. In addition, cloud service providers can unexpectedly increase prices. Soaring prices and usage fluctuations can disrupt financial operations. Organizations frequently lack visibility into cloud data spending to effectively manage their data analytics and AI budgets.

  • Workload fluctuations: Snowflake data processing and storage costs are driven by the amount of compute and storage resources used. As data cloud usage increases for new applications, dashboards, and uses, it becomes challenging to accurately estimate the required data processing and storage costs. This unpredictability can result in budget overruns that affect 60% of infrastructure and operations (I&O) leaders.
  • Unanticipated expenses: Spikes in streaming data volumes, large amounts of unstructured and semi-structured data, and shared warehouse consumption can quickly exceed cloud data budgets. These unforeseen usage peaks can catch organizations off guard, leading to unexpected data cloud costs.
  • Limited visibility: Accurately allocating costs across the company requires detailed visibility into the data cloud bill. Without query-level or user-level reporting, it becomes impossible to accurately attribute costs to various teams and departments. The result is confusion, friction and finger-pointing between groups as leaders blame high chargeback costs on reporting discrepancies.

By adopting a FinOps approach and leveraging granular data, organizations can establish spending guardrails, implement smart and effective controls over their data cloud spend, set up budgets, and utilize alerts to avoid data cloud cost overruns.

Data cloud workloads constrained by budget and staff limits

In 2024, IT organizations expect to shift their focus towards controlling costs, improving efficiency, and increasing automation. Cloud service provider price increases and growing usage add to existing economic pressures, while talent remains scarce and expensive. These cost and bandwidth factors are limiting the number of new data cloud workloads that can be launched.

“Data analytics, engineering & storage” is among the top three biggest skill gaps, and 54% of data teams say the talent shortage and time required to upskill employees are the biggest challenges to adoption of their AI strategy.

Global demand for AI and machine learning professionals is expected to increase by 40% over the next five years. Approximately one million new jobs will be created as companies look to leverage data and AI for a wide variety of use cases, from automation and risk analysis to security and supply chain forecasting.

AI adoption and data volume demand

Since ChatGPT broke usage records, generative AI is driving increased data cloud demand and usage. Data teams are struggling to maintain productivity as AI projects scale “due to increasing complexity, inefficient collaboration, and lack of standardized processes and tools” (McKinsey).

Data is foundational for AI and much of it is unstructured, yet IDC found most unstructured data is not leveraged by organizations. A lack of production-ready data pipelines for diverse data sources was the second-most-cited reason (31%) for AI project failure.

Discover your Snowflake savings with a free Unravel Health Check report
Request your report here

Data pipeline failures slow innovation

Data pipelines are becoming more complex, increasing the time required for root cause analysis (RCA) for breaks and delays. Data teams struggle most with data processing speed. Time is a critical factor that pulls skilled and valuable talent into unproductive firefighting. The more time they spend dealing with pipeline issues or failures, the greater the impact on productivity and delivery of new innovation.

Automated data pipeline monitoring and testing is essential for data cloud applications, since teams rapidly iterate and adapt to changing end-user needs and product requirements. Failed queries and data pipelines create data issues for downstream users and workloads such as analytics, BI dashboards, and AI/ML model training. These delays and failures can have a ripple effect that impacts end user decision-making and AI models that rely on accurate, timely content.

Conclusion

Unravel for Snowflake combines the power of AI and automation to help you overcome these challenges. With Unravel, Snowflake users get improved visibility to allocate costs for showback/chargeback, AI-driven recommendations to boost query efficiency, and real-time spend reporting and alerts to accurately predict costs. Unravel for Snowflake helps you optimize your workloads and get more value from your data cloud investments.

Take the next step and check out a self-guided tour or request a free Snowflake Health Check report.

Unravel Data Launches Cloud Data Cost Optimization for Snowflake
https://www.unraveldata.com/resources/unravel-data-launches-cloud-data-cost-optimization-for-snowflake/ (Tue, 14 Nov 2023)

Efficiency Recommendations for Infrastructure, Configuration, and Code

PALO ALTO, CA — November 14, 2023 — Unravel Data, the first AI-enabled data observability and FinOps platform built to address the speed and scale of modern data platforms, today announced the release of Unravel for Snowflake. By employing AI that is purpose-built for managing the Snowflake technology stack, Unravel puts cloud data cost management into the hands of Snowflake customers, providing them with granular insights into specific cost drivers as well as AI-driven cost and performance recommendations for optimizing SQL queries and data applications. Unravel for Snowflake is the latest data observability and FinOps product from Unravel Data, adding to a portfolio of purpose-built AI solutions that includes Databricks, EMR, Cloudera, and BigQuery.

Today, companies are looking to AI to provide them with a competitive advantage, which is driving an exponential increase in data usage and workloads, use cases, pipelines, and generative AI/LLM models. In turn, companies are facing even greater problems with broken pipelines and inefficient data processing, slowing time-to-business value and adding to exploding cloud data bills. Unfortunately, most companies lack visibility into their data cloud spend or ways to optimize data pipelines/workloads to lower spend, speed innovation, and mitigate problems. 

Unravel’s purpose-built AI for Snowflake draws on deep, granular observability to deliver AI-driven cost optimization recommendations for warehouses and SQL, including warehouse provisioning, run-time, auto-scaling efficiencies, and more. With Unravel, Snowflake users can see real-time cost usage by query, user, department, and warehouse, and set customized dashboards, alerts, and guardrails to enable accurate, granular cost allocation, trend visualization, and forecasting.

“As companies double down on AI efforts, we can expect to see more wasted data cloud spend. Costs are incurred not only with infrastructure but with consumption, as most AI pipelines are created in ways that drive up unnecessary cloud data costs,” said Kunal Agarwal, CEO and co-founder, Unravel Data. “Data engineering and architecture teams need an early warning system to alert them to out-of-control spending, an automated way to pinpoint the source of performance issues and cost overruns, and AI-driven recommendations to optimize code in ways that mitigate unnecessary costs, speed new development, and eliminate data pipeline problems.”

At the core of Unravel Data’s platform is its AI-powered Insights Engine, which has been trained to understand the intricacies and complexities of modern data platforms and the supporting infrastructure. The Insights Engine ingests and interprets millions of ongoing data streams to provide real-time insights into application and system performance, along with recommendations to optimize costs, including right-sizing instances and applying code recommendations for performance and financial efficiencies. When combined with Unravel’s automated guardrails and alerts, the Insights Engine enables organizations to achieve data cloud efficiency at scale.

“Our latest research shows that the adopters of cloud data warehouses struggle with data pipeline complexity, lack of staff/expertise, and an inability to predict workloads,” says Kevin Petrie, VP of Research at The Eckerson Group. “FinOps platforms for cloud data analytics, such as Unravel, provide the granular visibility that stakeholders need to predict and monitor spending. This makes it easier for companies to optimize workloads, change user behavior, and get a handle on governing cloud costs.”

Unravel for Snowflake includes additional features such as:

  • Visibility for cost allocation with chargeback/showback reports 
  • Warehouse-level insights and recommendations relating to warehouse consolidation and underutilization efficiencies
  • Compute + storage unit cost reporting with average cost per project, query, and user over time
  • SQL-related insights and recommendations for optimizing queries by filters, joins, projection inefficiencies, anti-patterns, and more to improve query efficiency and increase capacity so that more users and requests can be served at the same spend
  • Dashboard customization for at-a-glance summaries and drill-down insights for spend, performance, and unit costs
  • Alert customization using OpenSearch-based alerts beyond Snowflake’s out-of-the-box alerts to enable early warnings of resource usage spikes before they hit the cloud bill

To learn more about how we are helping Snowflake customers optimize their data cloud costs, and to request a complimentary “Health Check” (projected annual cost savings for your Snowflake warehouses using Unravel’s optimization insights, plus recommended actions to start saving), visit Unravel for Snowflake.

About Unravel Data

Unravel Data radically transforms the way businesses understand and optimize the performance and cost of their modern data applications, and the complex data pipelines that power those applications. Unravel’s market-leading data observability and FinOps platform, with purpose-built AI for each data platform, provides the actionable recommendations needed for cost and performance efficiencies across data and AI pipelines. A recent winner of Best Data Tool & Platform of 2023 in the annual SIIA CODiE Awards, Unravel Data is relied on by some of the world’s most recognized brands, like Maersk, Mastercard, Equifax, and Deutsche Bank, to unlock data-driven insights and deliver new innovations to market. To learn more, visit https://www.unraveldata.com.

Media Contact
Blair Moreland
ZAG Communications for Unravel Data
unraveldata@zagcommunications.com 

AI-Driven Observability for Snowflake
https://www.unraveldata.com/resources/ai-driven-observability-for-snowflake/ (Wed, 21 Jun 2023)

AI-DRIVEN DATA OBSERVABILITY + FINOPS FOR SNOWFLAKE

Performance. Reliability. Cost-effectiveness.

Unravel’s automated, AI-powered data observability + FinOps platform for Snowflake and other modern data stacks provides 360° visibility to allocate costs with granular precision, accurately predict spend, run 50% more workloads at the same budget, launch new apps 3X faster, and reliably hit greater than 99% of SLAs.

With Unravel Data Observability + FinOps for Snowflake you can:

  • Launch new apps 3X faster: End-to-end observability of data-native applications and pipelines. Automatic improvement of performance, cost efficiency, and reliability.
  • Run 50% more workloads for the same budget: Break down spend and forecast accurately. Optimize apps and platforms by eliminating inefficiencies. Set guardrails and automate governance. Unravel’s AI helps you implement observability and FinOps to ensure you achieve efficiency goals.
  • Reduce firefighting time by 99% using AI-enabled troubleshooting: Detect anomalies, drift, skew, and missing and incomplete data end-to-end. Integrate with multiple data quality solutions. All in one place.
  • Forecast budget with ±10% accuracy: Accurately anticipate cloud data spending for more predictable ROI. Unravel helps you forecast spending accurately with granular cost allocation. Purpose-built AI at the job, user, and workgroup levels enables real-time visibility into ongoing usage.

To see Unravel Data for Snowflake in action contact us today!
