The Adobe Analytics API is Adobe's REST interface for querying report data, managing segments and calculated metrics, auditing report suite configuration, and moving analytics data into your own data warehouse. It's the same API that powers the Analysis Workspace UI, and it's what you reach for when you've outgrown scheduled reports and data feeds.
If you've ever waited 4 hours for a Workspace report to render, then realized you need the same report 28 different ways for different stakeholders, you already know why the API matters. Scheduled reports in the UI cap out at 150 active. Data Feeds give you raw hits but nothing aggregated. The API fills the gap between those two.
Below, I'll cover what the Adobe Analytics API actually does in 2025, how authentication changed after JWT sunset, which endpoints you need, and the real workloads where the API pays for itself. I've used this API on BigQuery pipelines, mass segment migrations, and monthly report suite audits for European analytics teams since 2020.
Key Takeaways
Adobe Analytics has three APIs that matter in 2025: Reporting API 2.0 (current), Reporting API 1.4 (legacy, still works), and the Customer Journey Analytics API (separate product, separate endpoint). Most teams only need 2.0.
Authentication switched to OAuth Server-to-Server in 2025. JWT Service Account credentials are fully deprecated. If your data pipeline broke quietly in early 2025, this is why.
Rate limit is 12 requests per second per integration, 60 concurrent requests max. A single Workspace-style report may take 30+ seconds to resolve for large date ranges, so parallelism matters less than you'd think.
The highest-value workloads: daily automated pulls into BigQuery or Snowflake, segment migration between report suites, monitoring for configuration drift, dashboard backends where Workspace is too slow.
The API is included with any Adobe Analytics license. What you pay is engineering time plus occasional Data Warehouse API credits for huge exports (over 1 million rows).
What the Adobe Analytics API Actually Is
Adobe Analytics is a 20-year-old enterprise analytics platform with several parallel data access layers. Understanding which API to use starts with knowing which one answers your question fastest.
The Three APIs You Should Know About
Reporting API 2.0 (current, default). REST endpoints under analytics.adobe.io/api/{COMPANY_ID}. Used for anything you'd build in Analysis Workspace: reports, freeform tables, calculated metrics, segments. This is what "Adobe Analytics API" means 95% of the time in 2025.
Reporting API 1.4 (legacy, still works). The old SOAP/REST API from pre-Workspace days. Adobe still keeps it running for customers with old integrations. Don't build new pipelines on 1.4. If you inherit code that hits api.omniture.com or the legacy REST endpoints, that's 1.4.
Customer Journey Analytics (CJA) API. CJA is Adobe's newer product built on Experience Platform. It has its own API surface at cja.adobe.io (different endpoint, different shape, different concepts). If your team migrated to CJA, you're on a different API entirely. Most Adobe Analytics customers in 2025 are still on classic Analytics and use Reporting API 2.0.
There's also a Data Warehouse API for huge exports (over 1 million rows), a Real-Time API for live dashboards, and a Classification API for bulk classification management. You only touch these in specific edge cases.
What You Can Actually Do With Reporting API 2.0
- Run reports equivalent to anything in Analysis Workspace: freeform tables, cohort, flow, attribution.
- Manage segments: list, create, update, delete, share across report suites.
- Manage calculated metrics: same CRUD as segments, with the full function library.
- List metrics and dimensions available in a report suite (useful for dynamic dashboards).
- Query users in your org and their permissions.
- Query report suite settings: timezone, currency, visit definition, processing rules summary.
- Manage virtual report suites: list, create, update.
What you can't do with 2.0: modify tracking calls (that's AppMeasurement/Web SDK), change classifications in bulk (separate Classification API), or access the raw clickstream (that's Data Feeds, not an API).
Authentication: OAuth Server-to-Server
The authentication story is nearly identical to the Reactor API. If you read my Reactor API guide, you can skim this section.
JWT Service Account credentials were deprecated on January 1, 2025. Any Adobe Analytics API integration that hasn't been migrated to OAuth Server-to-Server as of early 2025 is broken.
The setup flow:
- Go to console.adobe.io and create a Project.
- Add Adobe Analytics as an API.
- Choose OAuth Server-to-Server credentials.
- Assign the integration to a Product Profile that has Adobe Analytics access (and the right report suite access). Permissions matter: the integration can only reach report suites assigned to its Product Profile.
- You receive a Client ID, Client Secret, Organization ID, and a scope list. The scopes you need: openid, AdobeID, additional_info.projectedProductContext, session.
- Exchange credentials for a 24-hour access token at ims-na1.adobelogin.com/ims/token/v3.
- Include the access token, Client ID, Organization ID, and your Company ID in request headers.
One Adobe Analytics-specific detail: you need the Global Company ID (a short alphanumeric string, not your IMS Org). Find it in Adobe Analytics under Admin > Company Settings, or call GET /discovery/me with your token and look for the globalCompanyId field.
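The token exchange and request headers above can be sketched in a few lines of stdlib Python. The header names follow Adobe's documented conventions for the 2.0 API; everything else (credential values, error handling) is a minimal sketch, not production code:

```python
# Minimal OAuth Server-to-Server sketch for the Adobe Analytics 2.0 API.
# CLIENT_ID / CLIENT_SECRET / company_id values are placeholders.
import json
import urllib.parse
import urllib.request

IMS_TOKEN_URL = "https://ims-na1.adobelogin.com/ims/token/v3"
SCOPES = "openid,AdobeID,additional_info.projectedProductContext,session"

def fetch_access_token(client_id: str, client_secret: str) -> str:
    """Exchange OAuth S2S credentials for a 24-hour access token."""
    payload = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": SCOPES,
    }).encode()
    with urllib.request.urlopen(IMS_TOKEN_URL, payload) as resp:
        return json.load(resp)["access_token"]

def api_headers(token: str, client_id: str, company_id: str) -> dict:
    """Headers every analytics.adobe.io request needs."""
    return {
        "Authorization": f"Bearer {token}",
        "x-api-key": client_id,
        "x-proxy-global-company-id": company_id,  # Global Company ID, not IMS Org
        "Content-Type": "application/json",
    }
```

Once you have the token, calling GET /discovery/me with these headers (minus the company ID) is the quickest way to confirm your globalCompanyId.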
Access token refresh, error handling, and retry logic work the same way as the rest of the Adobe I/O ecosystem.
Core Endpoints You'll Actually Use
Real Adobe Analytics API work touches maybe six endpoints, not the full surface. Here's what matters.
POST /reports
The workhorse. Send a JSON report definition, get back a rows/columns table. A minimal report definition includes rsid (report suite), globalFilters (date range and segments), metricContainer.metrics (what to count), and dimension (how to break it down).
For example, "Page Views by Page last 30 days for report suite acme-prod":
```json
{
  "rsid": "acme-prod",
  "globalFilters": [
    {"type": "dateRange", "dateRange": "2025-08-27T00:00:00.000/2025-09-26T00:00:00.000"}
  ],
  "metricContainer": {
    "metrics": [{"columnId": "0", "id": "metrics/pageviews"}]
  },
  "dimension": "variables/page"
}
```
Pagination: reports default to 10 rows. Real queries use settings.limit (max 50,000) and settings.page for pagination. For large datasets, paginate in parallel up to the rate limit.
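The request-plus-pagination loop can be sketched like this, assuming the endpoint shape above and a headers dict like the one from the authentication section. run_report and report_url are illustrative helpers, not SDK functions, and the lastPage flag reflects the 2.0 response shape as I understand it:

```python
# Sketch: POST /reports and page through results with settings.limit/settings.page.
import json
import urllib.request

def report_url(company_id: str) -> str:
    """2.0 reports endpoint for a given Global Company ID."""
    return f"https://analytics.adobe.io/api/{company_id}/reports"

def run_report(company_id: str, headers: dict, body: dict, limit: int = 5000):
    """Yield every row of a report, one page at a time."""
    page = 0
    while True:
        body.setdefault("settings", {}).update({"limit": limit, "page": page})
        req = urllib.request.Request(
            report_url(company_id),
            data=json.dumps(body).encode(),
            headers=headers,
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            data = json.load(resp)
        yield from data.get("rows", [])
        if data.get("lastPage", True):  # stop when the API flags the final page
            break
        page += 1
```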
GET /segments
Lists all segments available to the integration's Product Profile, across all authorized report suites. Supports filtering by rsids, tags, and ownerId. Most common use: pull all production-tagged segments for migration or audit.
POST /segments
Creates a new segment. The body is the segment definition in the same JSON shape Workspace uses internally. The safest way to generate this JSON is to build the segment in Workspace UI first, then export the definition via GET /segments/{id}?expansion=definition.
GET /calculatedmetrics and POST /calculatedmetrics
Same pattern as segments. Calculated metric definitions use a function tree (add, subtract, ratio, etc.) over base metrics. Like segments, easier to build in UI first and export the definition.
GET /collections/suites
Lists all report suites the integration can access, with key settings: timezone, currency, visit definition, hit count. Useful for dynamic dashboards that need to switch suites, and for configuration audits.
GET /dimensions and GET /metrics
Return the full list of dimensions and metrics (including eVars, sProps, and calculated metrics) available for a given report suite. Useful when you're building a dynamic report builder and need to know what's available before calling /reports.
Wondering if the API solves a specific reporting bottleneck you have? Get in touch for a scoped assessment. I'll tell you straight whether the API is the right fix or whether something in your Workspace setup is the real problem.
Real-World Use Cases
Four workloads where the Adobe Analytics API pays for itself in the first engagement.
Use Case 1: Daily Pipeline to BigQuery or Snowflake
The most common legitimate API workload. You want Adobe Analytics data sitting next to your CRM, subscription, and ad-spend data in a warehouse, joined by user or account ID.
When Dawid, a data engineering lead at a Polish SaaS company, started consolidating martech data into BigQuery earlier in 2025, his team had two options: export Data Feeds (raw clickstream, 50GB per day, needing parsing) or build an API pipeline that pulled pre-aggregated reports daily.
They went with the API. A Python job running in Cloud Composer pulls 12 report definitions every morning at 6 AM UTC, for the previous day and a rolling 30-day window. The data lands in a BigQuery table partitioned by date. Stakeholders query it through Looker Studio and Sigma without hitting Workspace at all.
Development: 3 weeks. Monthly runtime cost: under 100 EUR. The alternative (Data Feeds parsing) was estimated at 4 months of engineering to get right.
Use Case 2: Mass Segment Migration Between Report Suites
If your organization has multiple report suites (one per brand, region, or environment), segments built in one don't automatically exist in others. Workspace lets you copy one segment at a time. Slow, manual, error-prone.
Aleksandra, a senior analyst at a European retail group, needed to port 78 segments from a master report suite to 6 regional suites in summer 2025. Manually, that's 468 segment recreations. Through the API, it's a 30-line script: GET each segment with expansion, POST it to each target suite with the rsid swapped.
Total API work: 2 days. Most of that was writing validation to confirm the segments resolved identically across suites.
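The core of that migration loop can be sketched as below. get_json and post_json are hypothetical HTTP helpers standing in for whatever client you use, and the body fields shown are an assumption about the minimum a segment POST needs:

```python
# Sketch of a segment migration: GET each definition, POST it to each
# target suite with the rsid swapped. get_json/post_json are hypothetical.
def migrate_segments(segment_ids, target_rsids, get_json, post_json):
    """Copy each segment definition to every target report suite."""
    created = []
    for seg_id in segment_ids:
        seg = get_json(f"/segments/{seg_id}?expansion=definition")
        for rsid in target_rsids:
            body = {
                "name": seg["name"],
                "description": seg.get("description", ""),
                "rsid": rsid,  # swap the report suite on each copy
                "definition": seg["definition"],
            }
            created.append(post_json("/segments", body))
    return created
```

Validation (confirming each copied segment resolves identically in its target suite) is the part worth most of your time, as the anecdote above suggests.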
Use Case 3: Configuration Drift Monitoring
Enterprise Adobe Analytics deployments have dozens of report suites, each with its own variable definitions, processing rules, and marketing channel settings. Configuration drift between "sibling" suites is the #1 cause of reports that look right but aren't.
A monthly API audit script can pull every report suite's settings, every eVar/sProp/event definition, every processing rule, and diff against a reference suite. Changes flagged for review.
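The diff step at the heart of such an audit is only a few lines. The field names in the usage comment are illustrative, not an exhaustive list of suite settings:

```python
# Sketch: compare one suite's settings against a reference suite and
# report only the fields that drifted.
def diff_settings(reference: dict, suite: dict, fields) -> dict:
    """Return {field: (reference_value, suite_value)} for fields that differ."""
    return {
        f: (reference.get(f), suite.get(f))
        for f in fields
        if reference.get(f) != suite.get(f)
    }

# Usage idea: feed it the settings payloads from GET /collections/suites
# and check fields like "timezone", "currency", "visit definition".
```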
When I ran this audit for a financial services client in July 2025, the script flagged 34 silent changes across their 9 report suites that nobody in the analytics team knew about. Two of them were causing a 15% data discrepancy on a key funnel metric. The team had been debugging "mysterious" variance for 3 weeks before the audit explained it in a single run.
Use Case 4: Custom Dashboards Where Workspace Is Too Slow
Workspace is great for exploratory analysis. It's bad at 9 AM Monday when 40 people are loading the same dashboard and each request takes 20 seconds.
The fix: pre-compute the dashboard's reports via the API overnight, cache the results in Redis or a dedicated table, and serve the dashboard from cache. Load time drops from 20 seconds to sub-second. Workspace becomes the exploratory tool, your custom dashboard becomes the daily-use tool.
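The cache-aside pattern described above, sketched with an in-memory dict standing in for Redis. fetch_report is whatever function actually runs the report via the API:

```python
# Minimal cache-aside sketch: serve cached report rows while fresh,
# refetch once the TTL expires. A dict stands in for Redis here.
import time

class ReportCache:
    def __init__(self, ttl_seconds: int = 86400):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (expires_at, rows)

    def get_or_fetch(self, key: str, fetch_report):
        """Return cached rows if still fresh; otherwise fetch and cache."""
        hit = self._store.get(key)
        if hit and hit[0] > time.time():
            return hit[1]
        rows = fetch_report()
        self._store[key] = (time.time() + self.ttl, rows)
        return rows
```

In the real setup the overnight job warms the cache, so the first Monday-morning viewer never pays the 20-second cost.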
Have a specific data-access bottleneck in Adobe Analytics? See my services for scoped API work, pipeline development, and audit engagements.
Rate Limits, Errors, and Best Practices
Adobe Analytics API rate limit: 12 requests per second per integration, with a 60-request concurrency cap. In practice, the bottleneck for most teams is report execution time (Adobe runs your query on their backend), not the rate limit itself.
Response codes you'll see:
- 200: report ran, data in body.
- 202 Accepted: for anomaly detection reports, you get a ticket to poll. Most teams don't hit this.
- 400: malformed report JSON. The error body usually names the offending field.
- 401: token expired or invalid. Refresh and retry once.
- 403: your integration's Product Profile doesn't have access to the requested report suite.
- 408: report took too long to execute (typically over 10 minutes). Narrow the date range or add filters.
- 429: rate-limited. Back off.
- 500: Adobe backend error. Retry with exponential backoff. If it persists for a specific report, the query is probably exceeding an internal limit on breakdowns or row count.
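For the 429 and 500 cases, capped exponential backoff with jitter looks roughly like this. ApiError is a hypothetical wrapper for a failed HTTP call, not part of any Adobe SDK:

```python
# Retry sketch for rate limits (429) and backend errors (500):
# exponential backoff, capped at 30s, with a little jitter.
import random
import time

class ApiError(Exception):
    """Hypothetical error carrying the HTTP status of a failed call."""
    def __init__(self, status):
        super().__init__(f"HTTP {status}")
        self.status = status

def with_retries(call, max_attempts=5, base_delay=1.0, sleep=time.sleep):
    """Run `call`, retrying only on 429/500; re-raise anything else."""
    for attempt in range(max_attempts):
        try:
            return call()
        except ApiError as e:
            if e.status not in (429, 500) or attempt == max_attempts - 1:
                raise
            sleep(min(base_delay * 2 ** attempt, 30) + random.random())
```

A 401 should instead trigger one token refresh and a single retry, per the table above.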
Best practice 1: build reports in Workspace first, export the JSON definition. Almost every Adobe Analytics API report can be authored in Workspace and exported. There's no reason to hand-write the JSON.
Best practice 2: check your date ranges. Adobe Analytics reports run on the report suite's timezone, not UTC. Running a "yesterday" report from a UTC cron job in a US Pacific-timezone report suite produces 17 hours of yesterday and 7 hours of the day before. Always explicitly construct dates in the report suite timezone.
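A sketch of building "yesterday" in the report suite's timezone with the stdlib zoneinfo module, formatted in the same dateRange shape as the /reports example earlier:

```python
# Construct a "yesterday" dateRange in the report suite's timezone,
# not the UTC timezone the cron host happens to run in.
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

def yesterday_range(suite_tz: str, now=None) -> str:
    """Return a 2.0-style dateRange string for yesterday in suite_tz."""
    now = now or datetime.now(ZoneInfo(suite_tz))
    start = (now - timedelta(days=1)).replace(
        hour=0, minute=0, second=0, microsecond=0)
    end = start + timedelta(days=1)
    fmt = "%Y-%m-%dT%H:%M:%S.000"
    return f"{start.strftime(fmt)}/{end.strftime(fmt)}"
```

Drop the result straight into the report's globalFilters dateRange entry.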
Best practice 3: paginate with settings.page and cache results. A large breakdown (say, all pages for a 90-day window) might be 500,000 rows. Pull it once, cache it, query the cache. Don't re-run the same report hourly.
Best practice 4: use the community aanalytics2 Python SDK. Adobe doesn't ship an official Python client, but the open-source aanalytics2 library (by Julien Piccini) is widely used, actively maintained, and handles pagination, token refresh, and the quirks of 2.0 report definitions. Saves weeks of development.
When NOT to Use the API
Honest answer: most Adobe Analytics users should stay in Workspace.
- If your reporting needs are served by 10 scheduled reports, don't build an API pipeline. Workspace scheduled reports email PDFs or CSVs without any code.
- If you need raw clickstream data, Data Feeds or Customer Journey Analytics source connections are better than the API. The API returns aggregated reports, not hits.
- If your team has no Python or Node.js capacity, the API will rot. The initial pipeline runs fine, but token refresh changes, Adobe deprecates a parameter, and six months later nothing works.
- If you're evaluating whether Adobe Analytics is the right platform at all, start with the Adobe Analytics guide and Adobe Analytics help first. API work only makes sense once the core Workspace implementation is healthy.
The API rewards teams with structural reporting needs. If your data needs fit in Workspace, stay there.
Frequently Asked Questions
What's the difference between Adobe Analytics Reporting API 1.4 and 2.0?
API 2.0 is the modern Workspace-era API, released in 2017 and now the default. API 1.4 is the legacy SOAP/REST API from the Omniture days. Both still work, but 2.0 has all the new Workspace concepts (freeform tables, attribution IQ, anomaly detection). Don't build new integrations on 1.4. If you inherit a legacy integration, plan to migrate.
Do I need a separate Adobe Analytics license for the API?
No. API access is included with every Adobe Analytics license. You may need specific product permissions for a service account, but there's no additional cost for API access itself.
How do I authenticate to the Adobe Analytics API today?
OAuth Server-to-Server via Adobe Developer Console. Create a Project at console.adobe.io, add the Adobe Analytics API, generate OAuth Server-to-Server credentials, and exchange them for a 24-hour access token at ims-na1.adobelogin.com/ims/token/v3. JWT Service Account authentication was deprecated January 1, 2025.
What are the Adobe Analytics API rate limits?
12 requests per second per integration, with a 60-request concurrency cap. Most teams bottleneck on report execution time (queries running on Adobe's backend), not on the rate limit itself.
Can I use the Adobe Analytics API to get raw hit-level data?
No, the API returns aggregated reports, equivalent to what you see in Workspace. For raw clickstream, you need Data Feeds or Customer Journey Analytics source connections.
What's the best Python client for the Adobe Analytics API?
The community-maintained aanalytics2 library by Julien Piccini is the most complete Python client today. Adobe ships no official Python SDK. aanalytics2 handles authentication, pagination, segment and calculated metric management, and the quirks of Reporting API 2.0 report definitions.
Conclusion
The Adobe Analytics API is the answer when Workspace and scheduled reports aren't enough. The workloads where it pays off within the first engagement: a daily pipeline to your data warehouse, mass segment migration between suites, configuration drift monitoring, and pre-computed dashboard backends. If your reporting fits inside Workspace, stay there. If your team doesn't have a Python or Node.js developer, don't start.
Start by building one report in Workspace, exporting its JSON definition, and running it through the API via Postman or curl. Once you see the response shape, you understand 80% of what this API does. The rest is pagination, segments, and schedule management.
Thinking about whether the Adobe Analytics API belongs in your stack? Get in touch for a scoped assessment. I'll review your current reporting setup, identify whether the bottleneck is actually an API problem, and give you a realistic engineering estimate. No pitch deck, no upsell.
Meta Elements
```
Meta Title: Adobe Analytics API (Reporting 2.0): 2025 Guide
Meta Description: Adobe Analytics Reporting API 2.0 in 2025: OAuth auth, endpoints, rate limits, BigQuery pipelines, and the use cases where it actually pays off.
Primary Keyword: Adobe Analytics API
Secondary Keywords: Adobe Analytics Reporting API 2.0, analytics.adobe.io, Adobe Analytics data export, CJA API
URL Slug: /articles/en/adobe-analytics-api-guide.html
Internal Links:
- https://piotrlitwa.com/articles/en/reactor-api-adobe-launch-api-guide.html
- https://piotrlitwa.com/articles/en/adobe-analytics-what-is-it-complete-guide.html
- https://piotrlitwa.com/articles/en/adobe-analytics-help.html
- https://piotrlitwa.com/services.html#ga4 (x3 CTAs)
External Links:
- console.adobe.io (Adobe Developer Console)
- analytics.adobe.io (API endpoint)
- ims-na1.adobelogin.com (OAuth token endpoint)
Word Count: ~2700
```
Need an honest Adobe Analytics API assessment?
I review Adobe Analytics reporting setups and scope API work that actually solves the bottleneck. No slide decks, no upsell.
See my services. Need help? Get in touch.
Have a question about your analytics setup? Fill out the form, I usually reply within 24 hours.