You open Google Ads and see 150 conversions at $45 CPA. Not bad. Then Meta Ads Manager shows 89 conversions at $52 CPA. LinkedIn reports 34 at $88 CPA. Add them up – 273 conversions total.
Your CRM shows 180 actual customers.
The math doesn’t work because each platform defaults to last-click attribution and counts the same person three times. This isn’t a tracking bug – it’s what happens when analytics tools don’t talk to each other. You’re making budget decisions based on inflated numbers, and by some industry estimates nearly half of marketing spend gets wasted this way.
What Cross-Platform Analytics Actually Means
Cross-platform analytics unifies data from multiple sources – web, mobile, ads, CRM – into one dashboard that tracks the same user across all touchpoints. Instead of three separate conversion counts, you see one customer journey.
The challenge is that every platform speaks a different language. Google Ads tracks “signup_completed,” your mobile app logs “user_registration_success,” and Meta counts “CompleteRegistration.” Without standardized event naming, you can’t compare anything accurately.
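One common fix is a translation layer that maps each platform's native event names onto a single canonical schema before the data lands in your warehouse. Here is a minimal sketch – the platform event names come from the examples above, while the canonical names and source labels are illustrative:

```python
# Map each platform's native event name onto one canonical event, so
# "signup_completed", "user_registration_success", and
# "CompleteRegistration" all count as the same thing downstream.
CANONICAL_EVENTS = {
    ("google_ads", "signup_completed"): "signup",
    ("mobile_app", "user_registration_success"): "signup",
    ("meta", "CompleteRegistration"): "signup",
}

def normalize_event(source: str, raw_event: str) -> str:
    """Return the canonical event name, or flag unmapped events for review."""
    try:
        return CANONICAL_EVENTS[(source, raw_event)]
    except KeyError:
        # Surface unknown events instead of silently dropping them.
        return f"unmapped:{source}:{raw_event}"

print(normalize_event("meta", "CompleteRegistration"))  # -> signup
```

Teams usually keep this mapping in version control and treat any `unmapped:` event as a data-quality ticket rather than letting it leak into reports.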
Here’s what breaks: attribution windows don’t line up (Meta now defaults to a 7-day click / 1-day view window, Google Ads to 30 days, LinkedIn to a 30-day click / 7-day view window, and all are configurable), view-through conversions don’t sync (someone sees a Facebook ad but buys through Google search 3 days later), and cross-device tracking fails (a mobile browse followed by a desktop purchase shows up as two users). According to research cited by Dataslayer, 15-30% of attribution stays fuzzy even with proper setup.
Power BI with Copilot: Microsoft’s Enterprise Bet
Power BI integrates natively with Microsoft 365, Azure, and Dynamics 365. If your company already runs on Microsoft infrastructure, this is the path of least resistance. Copilot generates DAX formulas, creates report summaries, and answers natural language questions about your data.
The catch: Copilot isn’t actually available to “all users” despite what Microsoft’s marketing says. You need Fabric F64 capacity minimum – that’s $18,500+ per year before you even factor in per-user licenses. Community forums from May 2025 confirm Premium Per User (PPU) licenses don’t grant Copilot access; you must provision F64+ capacity at the tenant level.
Pricing works on Capacity Units (CUs). According to Microsoft’s official announcement, Copilot consumes 400 CU seconds per 1,000 input tokens and 1,200 CU seconds per 1,000 output tokens. For a team of 20 analysts asking 50 questions daily, costs add up fast – one detailed report generation can burn through hours of capacity.
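Those consumption rates make it possible to rough out daily Copilot burn before committing to capacity. The token counts per question below are assumptions for illustration; only the CU rates come from Microsoft's announcement:

```python
# Rough Copilot capacity burn, using Microsoft's published rates:
# 400 CU-seconds per 1,000 input tokens, 1,200 per 1,000 output tokens.
CU_SECONDS_PER_1K_INPUT = 400
CU_SECONDS_PER_1K_OUTPUT = 1200

def daily_cu_seconds(analysts, questions_per_day, input_tokens, output_tokens):
    """Total CU-seconds consumed per day; token counts are per question."""
    per_question = (input_tokens / 1000) * CU_SECONDS_PER_1K_INPUT \
                 + (output_tokens / 1000) * CU_SECONDS_PER_1K_OUTPUT
    return analysts * questions_per_day * per_question

# 20 analysts x 50 questions/day, assuming ~2K input / ~1K output tokens each
burn = daily_cu_seconds(20, 50, input_tokens=2000, output_tokens=1000)
print(f"{burn:,.0f} CU-seconds/day ({burn / 3600:.0f} CU-hours)")
```

Under those assumptions a single team consumes on the order of two million CU-seconds a day, which is why one detailed report generation can eat hours of capacity.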
Pro tip: Power BI Desktop is free for local development. Build and test dashboards there before committing to Fabric capacity. Many teams prototype entirely offline, then publish only production-ready reports to avoid burning CUs on iteration.
Cross-platform capability depends on connectors. Power BI connects to 500+ data sources, but advertising platforms (Meta, TikTok, LinkedIn) require third-party connectors that typically cost $69-149/month each. The “unified dashboard” promise assumes you’ve already solved data integration – Power BI won’t do that part for you.
Tableau Pulse: AI That Pushes Insights to You
Tableau flipped the script with Pulse. Instead of logging into dashboards to check metrics, Pulse delivers AI-generated summaries via Slack, email, or mobile app. It detects anomalies, explains trends in natural language, and suggests follow-up questions before you ask.
Pulse is included free with all Tableau Cloud editions (Creator licenses run $75/user/month on the Enterprise tier, per Julius AI’s pricing analysis). But premium features – Enhanced Q&A, Dynamic Sorting & Grouping, Metric Goals – require Tableau+, a subscription tier that isn’t prominently disclosed on the main pricing page.
Pulse shines with metrics that change frequently. Say a marketing director follows campaign ROAS, conversion rate, and CAC. When ROAS drops 15%, Pulse doesn’t just alert – it identifies which segment drove the decline (say, mobile traffic from paid search) and correlates it with other metrics (higher click volume but lower conversion quality).
| Feature | Tableau Cloud (Free Pulse) | Tableau+ (Premium Pulse) |
|---|---|---|
| AI metric summaries | ✓ | ✓ |
| Slack/email delivery | ✓ | ✓ |
| Enhanced Q&A (multi-metric) | ✗ | ✓ |
| Dynamic sorting/grouping | ✗ | ✓ |
| Metric goal tracking | ✗ | ✓ |
For cross-platform analytics, Tableau connects to Snowflake, BigQuery, Databricks, and most data warehouses. The assumption: your data engineering team has already unified raw sources. Tableau visualizes clean data – it won’t fix messy, fragmented inputs.
ThoughtSpot: Natural Language Search at Scale
ThoughtSpot markets itself as “Google for your data.” Type “which products had highest margin this quarter” and get an answer in seconds. No SQL, no chart configuration – just conversational queries.
Pricing changed in late 2025. The Essentials plan now starts at $25/user/month (5-50 users, 25M data rows). The Pro plan switched to $0.10 per query with a 10M AI query cap. Here’s the gotcha: every question you type is a query, and AI-suggested clarification prompts count too. According to ThoughtSpot’s official pricing, active users can burn through quotas faster than headcount math suggests.
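Per-query pricing rewards doing the arithmetic before signing. A sketch using the $0.10 rate and 10M cap above – the usage figures and the one-clarification-per-question assumption are illustrative:

```python
# Estimate monthly ThoughtSpot Pro spend at $0.10 per AI query,
# against the 10M query cap. Usage figures are illustrative.
PRICE_PER_QUERY = 0.10
QUERY_CAP = 10_000_000

def monthly_queries(users, questions_per_user_per_day, workdays=22,
                    clarifications_per_question=1):
    """Billable queries per month; each clarification prompt bills separately."""
    per_question_total = 1 + clarifications_per_question
    return users * questions_per_user_per_day * workdays * per_question_total

q = monthly_queries(users=50, questions_per_user_per_day=40)
print(f"{q:,} queries/month -> ${q * PRICE_PER_QUERY:,.0f}, "
      f"cap lasts ~{QUERY_CAP / q:.0f} months")
```

Plug in your own numbers – the clarification multiplier and any scheduled or embedded refreshes are what push real deployments past what seat counts alone would predict.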
Enterprise tier (custom pricing, typically $2,500+/month) removes query limits but adds another complexity: unlimited users means you can’t predict costs based on headcount. If 200 employees suddenly start querying dashboards, your bill scales with usage, not seats.
Cross-platform strength: ThoughtSpot’s Spotter AI Agent automatically surfaces trends and anomalies you didn’t ask about. A retail dashboard might proactively flag that weekend sales dropped 12% across all channels, correlate it with a spike in cart abandonment on mobile, and suggest investigating checkout flow. This goes beyond answering questions – it anticipates what you should be asking.
When Embedded Analytics Makes More Sense
Some businesses don’t need internal dashboards – they need to offer analytics to customers. SaaS platforms, agencies, and ecommerce tools embed dashboards into their products so clients see their own data.
Polymer AI specializes in embedded analytics. API access starts at $500/month according to their official pricing page. You get white-label dashboards (your brand, your domain), AI-generated visualizations, and conversational queries. The platform analyzes spreadsheets and APIs to surface trends without manual chart configuration.
Use case: A marketing agency manages 50 client accounts across Google Ads, Meta, Shopify, and email platforms. Instead of building 50 custom dashboards, Polymer generates them automatically from connected data sources. Clients log in, ask “why did my ROAS drop last week,” and get AI explanations pointing to specific campaigns or audience segments.
Onvo AI offers similar embedded capabilities but adds 30+ language support and flexible LLM selection – you can use OpenAI, Gemini, Anthropic, or even self-hosted models. This matters for companies with data residency requirements or those wanting to avoid vendor lock-in. White-label embedding includes custom branding, mobile responsiveness, and PDF/Excel export.
What About Open-Source Options?
Metabase is the most popular open-source BI tool. The free self-hosted version supports unlimited users and dashboards. You write SQL to query data, then build charts and combine them into dashboards. It lacks native AI features but integrates well with external LLMs – connect OpenAI or Anthropic APIs to generate SQL from natural language descriptions.
According to Metabase’s official site, Pro tier adds advanced permissions and embedding, while Enterprise includes SSO, multi-instance management, and premium support. Hosting costs depend on your infrastructure – AWS or Google Cloud expenses vary based on query volume and data size.
Limitation: Metabase doesn’t solve cross-platform data integration. You still need to pipe data from Google Ads, Meta, Shopify, and your CRM into a unified warehouse (Snowflake, BigQuery, Postgres). Metabase queries that warehouse – it doesn’t connect directly to advertising platforms.
The Attribution Problem No Dashboard Solves Automatically
Here’s the part most tutorials skip: even with a unified dashboard, attribution stays broken unless you actively fix it.
Scenario: A customer sees your TikTok ad (doesn’t click), searches your brand on Google the next day (clicks paid search ad), receives a retargeting email (clicks), and finally converts via organic search 3 days later. TikTok counts a view-through conversion. Google Ads claims a click conversion. Email platform records a click-to-purchase. Google Analytics attributes it to organic search (last click).
You’ve now “generated” 4 conversions from 1 customer.
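Reconciling those four claims usually starts by collapsing platform-reported conversions onto a shared order or customer ID. A sketch with made-up records matching the scenario above:

```python
# Four platform-reported conversions for the same purchase collapse to one
# once you key on a shared order ID. Records are illustrative.
reported = [
    {"order_id": "A1", "platform": "tiktok", "touch": "view-through"},
    {"order_id": "A1", "platform": "google_ads", "touch": "paid click"},
    {"order_id": "A1", "platform": "email", "touch": "click"},
    {"order_id": "A1", "platform": "ga4", "touch": "organic (last click)"},
]

platform_total = len(reported)                         # what dashboards sum
actual_total = len({r["order_id"] for r in reported})  # unique orders

print(f"platforms claim {platform_total}, CRM truth is {actual_total}")
```

The hard part in practice is getting that shared ID into every platform's export in the first place – which is what server-side tracking tools exist to do.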
Most analytics dashboards display each platform’s self-reported numbers without reconciling conflicts. They show four different versions of reality and expect you to figure out which one is true. Research from LayerFive confirms this: teams spend 40% of their time on reporting and analysis instead of optimization, and 68% of marketing leaders have “more data than ever” but feel “less confident in decisions.”
- Last-click attribution: Gives 100% credit to the final touchpoint (usually organic search or direct traffic). Undervalues top-of-funnel channels like display ads or social media.
- First-click attribution: Credits the channel that introduced the customer. Ignores everything that happened between awareness and purchase.
- Linear attribution: Splits credit equally across all touchpoints. Treats a 5-second TikTok view the same as a 20-minute product demo.
- Data-driven attribution: Uses machine learning to weight touchpoints based on actual conversion patterns (now GA4’s default model). Needs substantial volume – on the order of 15,000 conversions over 90 days – to find reliable patterns.
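The difference between the simpler models is just how one conversion's credit is divided over the same touchpoint sequence. A sketch over the journey from the earlier scenario (data-driven attribution is omitted because it requires a trained model, not a rule):

```python
# Split one conversion's credit across a touchpoint sequence under
# three rule-based attribution models.
def attribute(touchpoints, model):
    """Return {touchpoint: credit} summing to 1.0 for one conversion."""
    if model == "last_click":
        return {touchpoints[-1]: 1.0}
    if model == "first_click":
        return {touchpoints[0]: 1.0}
    if model == "linear":
        share = 1.0 / len(touchpoints)
        return {t: share for t in touchpoints}
    raise ValueError(f"unknown model: {model}")

journey = ["tiktok_view", "paid_search", "email", "organic_search"]
for model in ("last_click", "first_click", "linear"):
    print(model, attribute(journey, model))
```

Run over thousands of journeys, these rules produce wildly different channel budgets from identical data – which is why the model choice matters more than the dashboard.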
The dashboard can’t decide which model to use. You have to choose based on your typical customer journey and sales cycle. A B2B company with 90-day sales cycles needs different attribution than an ecommerce store with same-day purchases.
Hidden Costs Nobody Mentions
Pricing pages show per-user subscriptions, but real costs include:
Data connector fees: Power BI and Tableau connect natively to databases and warehouses, but advertising platforms (Meta, TikTok, LinkedIn) require third-party connectors at $69-149/month each. Connecting 5 ad platforms adds $400-750/month.
Capacity overages: Power BI’s Fabric capacity bills by Capacity Units (CUs). Heavy dashboard use during month-end reporting can spike costs 2-3x your baseline. According to community reports, teams often pause capacity during slow periods to control spend – but dashboards go offline when capacity is stopped.
Query limits: ThoughtSpot’s Pro plan caps at 10 million AI queries. That sounds like a lot, but every typed question, AI clarification prompt, and scheduled or embedded refresh bills as a query. 50 users each asking 4,000 questions per month (about 200 per working day) already generate 200,000 billable queries monthly before automation is counted – and embedded or automated workloads can multiply that many times over. Heavy deployments exhaust the cap far sooner than seat counts suggest and face overage charges or forced upgrades to Enterprise.
Data warehouse costs: Cross-platform dashboards query unified data, not raw sources. You need Snowflake, BigQuery, or Redshift to consolidate data first. Those services bill by storage and compute – costs scale with query frequency and data volume.
Which Tool Makes Sense for Your Use Case
Pick based on what you’re actually trying to do:
- Internal BI for Microsoft shops: Power BI with Copilot if you have F64 capacity budget and already use Azure/Office 365. Deep integration saves setup time.
- Proactive insights for executives: Tableau Pulse if leadership wants AI summaries delivered to Slack/email instead of logging into dashboards. Premium features require Tableau+ but basic Pulse is free with Cloud.
- Self-service analytics for non-technical teams: ThoughtSpot if users need to ask questions conversationally. Watch query limits on Pro plan – Enterprise makes sense for teams with 100+ active users.
- Embedded client-facing analytics: Polymer or Onvo AI if you’re building dashboards into your product for customers. API-first architecture, white-label branding, no per-seat pricing.
- Budget-conscious or open-source preference: Metabase if you can self-host and have SQL skills. Free tier is powerful but requires data engineering to unify sources first.
None of these tools automatically fix attribution. They visualize data – you decide how to interpret overlapping conversions, conflicting metrics, and cross-device gaps.
What Breaks at Scale
Small teams (5-10 users) rarely hit limits. Problems surface when you cross 50 users or 1M+ data rows:
Dashboard load times: Queries that ran in 2 seconds at 100K rows take 30 seconds at 5M rows. Users stop checking dashboards when they’re too slow. Fix: pre-aggregate data, use caching, or upgrade to premium compute tiers (which costs more).
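Pre-aggregation means rolling raw rows up to the grain the dashboard actually displays (e.g., daily totals per channel) so queries scan thousands of rows instead of millions. A pure-Python sketch of the idea – in production this would be a materialized view or a scheduled warehouse job, and the rows here are illustrative:

```python
from collections import defaultdict

# Roll raw event rows up to daily spend per channel, so the dashboard
# queries the small aggregate table instead of millions of raw rows.
raw_rows = [
    {"date": "2025-06-01", "channel": "meta", "spend": 120.0},
    {"date": "2025-06-01", "channel": "meta", "spend": 80.0},
    {"date": "2025-06-01", "channel": "google", "spend": 200.0},
    {"date": "2025-06-02", "channel": "meta", "spend": 50.0},
]

daily = defaultdict(float)
for row in raw_rows:
    daily[(row["date"], row["channel"])] += row["spend"]

for (date, channel), spend in sorted(daily.items()):
    print(date, channel, spend)
```

The trade-off is granularity: once rows are rolled up to daily totals, the dashboard can no longer drill down to individual events without hitting the raw table again.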
Conflicting definitions: Marketing defines “conversion” as form submissions, sales counts only closed deals, finance measures revenue. One dashboard shows 500 conversions, another shows 180. Everyone argues about whose numbers are right. Fix: document metric definitions, enforce consistent event naming across all platforms.
Stale data: Most dashboards refresh hourly or daily. A campaign fails at 9am, but the dashboard shows yesterday’s data until noon. By then, you’ve wasted half a day’s budget. Fix: real-time streaming (expensive) or accept 1-2 hour lag for batch jobs (cheaper).
Frequently Asked Questions
Can Power BI Copilot really work without F64 capacity?
No. Despite Microsoft’s broad marketing, Copilot requires Fabric F64 capacity minimum ($18,500+/year) or a Fabric Copilot Capacity (FCC) allocation. Premium Per User (PPU) licenses don’t grant access. Community forums confirm this was a major surprise for teams who invested setup time before realizing the actual cost.
How do I prevent platforms from counting the same conversion multiple times?
Implement server-side tracking with a unified customer ID that follows users across platforms. Tools like Cometly or Triple Whale use first-party pixels that bypass browser restrictions and reconcile conversions across Google Ads, Meta, and other channels. You define attribution rules (last-click, linear, data-driven) in one place instead of trusting each platform’s self-reported numbers. Accept that 15-30% of attribution will remain fuzzy due to cross-device gaps and view-through conversions that can’t be perfectly tracked – this is normal, not a failure of your setup.
What’s the real cost difference between building dashboards in-house versus using an embedded analytics platform?
Building from scratch requires front-end developers (chart libraries, UI design), back-end engineers (data pipelines, query optimization), and ongoing maintenance (scaling infrastructure, adding connectors as clients request them). Agencies report 6-12 months and $150K-300K in dev costs to build what Polymer or Onvo deliver out of the box for $500-2,000/month. The math flips if you need extreme customization or have 500+ clients – at that scale, in-house development pays off. For most teams under 100 clients, embedded platforms win on speed and cost. Just remember: these platforms still don’t unify your data sources automatically; you provide clean inputs via API or data warehouse connections.