Usage & Cost Optimization
Overview
Euno's usage and optimization features help you understand how your data resources are actually being used, identify waste, reduce costs, and improve performance across your entire data stack.
Key Question: You have hundreds of thousands of tables, dashboards, and models, but which ones actually matter?
Why Usage Data Matters
The Challenge
Most data teams face these problems:
🤷 Uncertainty
"Is anyone actually using this table?"
"Which dashboards can we safely deprecate?"
"Are our expensive transformations even being queried?"
💸 Hidden Costs
Tables consuming storage that nobody queries
Expensive dashboards with zero viewers
dbt models that take 30 minutes to build but are never used
⏰ Wasted Time
Maintaining unused resources
Optimizing the wrong things
No visibility into what's actually important
❓ Questions You Can't Answer
What's our most critical data asset?
Which tables justify their compute costs?
Where should we invest optimization efforts?
The Solution
Usage data answers all of these questions by tracking:
Query patterns - What's being queried and how often?
Cost attribution - Which resources are most expensive?
User engagement - Who's using what and when?
Performance metrics - What's slow and needs optimization?
Adoption trends - Are new dashboards being adopted?
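The signals above can be rolled up into a per-resource usage summary. A minimal sketch in Python, assuming usage events are available as plain records; the field names (`resource`, `user`, `cost_usd`, `day`) are illustrative, not Euno's actual schema:

```python
from collections import defaultdict
from datetime import date

# Illustrative usage events; in practice these would come from your
# warehouse's query history or Euno's usage collection.
events = [
    {"resource": "analytics.orders", "user": "ana", "cost_usd": 0.40, "day": date(2024, 5, 1)},
    {"resource": "analytics.orders", "user": "ben", "cost_usd": 0.25, "day": date(2024, 5, 2)},
    {"resource": "staging.tmp_old",  "user": "ana", "cost_usd": 0.05, "day": date(2024, 1, 9)},
]

def summarize(events):
    """Aggregate query count, distinct users, total cost, and last-used day."""
    summary = defaultdict(lambda: {"queries": 0, "users": set(), "cost_usd": 0.0, "last_used": None})
    for e in events:
        s = summary[e["resource"]]
        s["queries"] += 1
        s["users"].add(e["user"])
        s["cost_usd"] += e["cost_usd"]
        s["last_used"] = max(s["last_used"] or e["day"], e["day"])
    return dict(summary)

stats = summarize(events)
print(stats["analytics.orders"]["queries"])  # 2
```

Each of the optimization strategies below is some filter or ranking over a summary like this one.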
Usage-Driven Optimization Strategies
1. Identify & Deprecate Unused Resources
The Opportunity: 15-30% of data resources typically go unused, yet they consume:
Storage costs
Maintenance time
Mental overhead
Documentation burden
How Euno Helps:
Step 1: Find Candidates
Step 2: Verify Safety
Check downstream dependencies (Impact Analysis)
Verify with owners (automated Workflow notifications)
Confirm no business-critical processes
Step 3: Deprecate
Archive tables instead of dropping
Document deprecation reason
Monitor for unexpected queries
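Finding candidates in Step 1 comes down to an idle-window filter over last-queried timestamps. A rough sketch with made-up data (the catalog shape and 90-day window are assumptions, not Euno's implementation):

```python
from datetime import date, timedelta

# Hypothetical catalog snapshot: last query date per resource.
last_queried = {
    "analytics.orders": date(2024, 5, 2),
    "staging.tmp_old": date(2024, 1, 9),
    "legacy.events_v1": None,  # never queried since tracking began
}

def deprecation_candidates(last_queried, today, min_idle_days=90):
    """Resources with no queries in the idle window are candidates only --
    verify downstream dependencies and owners before deprecating."""
    cutoff = today - timedelta(days=min_idle_days)
    return sorted(
        name for name, last in last_queried.items()
        if last is None or last < cutoff
    )

print(deprecation_candidates(last_queried, today=date(2024, 6, 1)))
# ['legacy.events_v1', 'staging.tmp_old']
```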
Expected Impact:
Cost reduction
Cleaner, more maintainable catalog
Improved data discovery
2. Optimize High-Cost, High-Usage Resources
The Opportunity: Your most-queried tables are often not optimized, leading to:
Excessive compute costs
Slow query performance
Poor user experience
How Euno Helps:
Step 1: Identify High-Value Targets
Step 2: Analyze Patterns
Query frequency and timing
User base and access patterns
Current storage structure
Step 3: Apply Optimizations
Add clustering keys (Snowflake)
Add partitioning (BigQuery)
Create materialized views
Implement caching strategies
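Identifying high-value targets in Step 1 amounts to keeping only tables that are both heavily queried and expensive, then ranking them by cost. A sketch with invented monthly numbers and thresholds (not Euno's actual heuristics):

```python
# Hypothetical monthly stats per table.
tables = {
    "analytics.orders":   {"queries": 12_000, "cost_usd": 900.0},
    "analytics.sessions": {"queries": 300,    "cost_usd": 850.0},
    "staging.tmp_old":    {"queries": 4,      "cost_usd": 2.0},
}

def optimization_targets(tables, min_queries=1_000, min_cost=100.0):
    """High-usage AND high-cost tables repay optimization effort fastest;
    expensive but rarely queried tables are materialization or deprecation
    questions instead."""
    hits = [
        (stats["cost_usd"], name)
        for name, stats in tables.items()
        if stats["queries"] >= min_queries and stats["cost_usd"] >= min_cost
    ]
    return [name for _, name in sorted(hits, reverse=True)]

print(optimization_targets(tables))  # ['analytics.orders']
```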
Expected Impact:
Cost reduction on targeted tables
2-10x query performance improvement
Better user experience
3. Convert Views to Materialized Tables
The Opportunity: Complex views that are queried frequently waste compute on every query.
How Euno Helps:
Step 1: Find Expensive Views
Step 2: Calculate ROI
Current compute cost: $1,500/month
Materialization cost: $200/month (storage + refresh)
Potential savings: $1,300/month
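The ROI arithmetic above is simply view compute minus the combined cost of storage and refreshes. A one-function sketch using the same figures (the $80/$120 split of the $200 materialization cost is an assumption for illustration):

```python
def materialization_roi(view_compute_usd, storage_usd, refresh_usd):
    """Monthly savings from replacing repeated view computation with a
    materialized table refreshed on a schedule."""
    materialized_cost = storage_usd + refresh_usd
    return view_compute_usd - materialized_cost

savings = materialization_roi(view_compute_usd=1_500, storage_usd=80, refresh_usd=120)
print(savings)  # 1300
```

A positive result means materialization pays for itself every month; a negative one means the view is cheap enough to leave as-is.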
Step 3: Implement
Convert view to materialized table
Schedule incremental refreshes
Update downstream dependencies
Expected Impact:
Cost reduction
Consistent query performance
Reduced warehouse contention
4. Eliminate Unused Dashboards
The Opportunity: Dashboards consume:
Scheduled refresh compute
Developer maintenance time
Clutter in BI tools
How Euno Helps:
Step 1: Find Abandoned Dashboards
Step 2: Validate with Owners
Automated Workflow notifications
Give owners 30 days to respond
Document reasons for keeping
Step 3: Deprecate or Archive
Disable scheduled refreshes
Archive workbooks
Document for historical reference
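The three steps above combine an idle check with the 30-day owner grace window: a dashboard is archived only when it is both abandoned and its owner has had the full window to object. A sketch with hypothetical metadata (field names and windows are illustrative):

```python
from datetime import date, timedelta

# Hypothetical dashboard metadata: last view date and owner notification date.
dashboards = [
    {"name": "Q1 Exec Review", "last_viewed": date(2023, 11, 2), "notified": date(2024, 4, 20)},
    {"name": "Daily Revenue",  "last_viewed": date(2024, 5, 30), "notified": None},
    {"name": "Old Funnel",     "last_viewed": date(2024, 1, 5),  "notified": date(2024, 6, 10)},
]

def ready_to_archive(dashboards, today, idle_days=90, grace_days=30):
    """Archive only dashboards that are idle AND whose owner was notified
    at least the full grace window ago."""
    idle_cutoff = today - timedelta(days=idle_days)
    return [
        d["name"] for d in dashboards
        if d["last_viewed"] < idle_cutoff
        and d["notified"] is not None
        and today - d["notified"] >= timedelta(days=grace_days)
    ]

print(ready_to_archive(dashboards, today=date(2024, 7, 1)))
# ['Q1 Exec Review']
```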
Expected Impact:
Reduced BI tool clutter
Lower compute costs (scheduled refreshes)
Improved user experience (easier to find relevant content)
5. Optimize dbt Build Times
The Opportunity: Slow dbt models delay:
CI/CD pipelines
Data freshness
Developer productivity
How Euno Helps:
Step 1: Find Bottlenecks
Step 2: Analyze Causes
Complex joins
Large data scans
Inefficient SQL
Missing incremental logic
Step 3: Optimize
Implement incremental models
Add filters/limits for development
Optimize SQL logic
Parallelize where possible
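Finding bottlenecks in Step 1 is a threshold filter over per-model build durations, which dbt records in its run_results.json artifact. A sketch with invented timings (the 10-minute threshold is an arbitrary example):

```python
# Hypothetical per-model build durations in seconds, e.g. parsed from
# dbt's run_results.json artifact.
timings = {
    "stg_orders": 14.2,
    "fct_orders": 1_860.0,   # ~31 minutes
    "dim_customers": 95.0,
    "fct_sessions": 640.0,
}

def bottlenecks(timings, threshold_s=600):
    """Models slower than the threshold, worst first -- the short list to
    check for missing incremental logic or full-table scans."""
    slow = [(secs, name) for name, secs in timings.items() if secs > threshold_s]
    return [name for _, name in sorted(slow, reverse=True)]

print(bottlenecks(timings))  # ['fct_orders', 'fct_sessions']
```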
Expected Impact:
50-80% reduction in build times
Faster CI/CD pipelines
Improved developer experience
Next Steps
Enable Usage Collection
Configure your integrations
Explore Your Usage Data
Use AI Assistant to query
Sort and filter in UI
Identify quick wins
Implement Your First Optimization
Start with deprecated resource cleanup
Measure impact
Set Up Monitoring
Create usage-based Workflows
Get alerted to issues
Related Documentation
Impact Analysis - Understand change impact
Workflows - Automate usage monitoring
Source Integrations - Connect your data tools
AI Assistant - Query usage data with natural language