RUM Source Maps in OpenObserve

The Problem Every Frontend Team Faces

You ship your application to production. Vite (or Webpack, or Rollup) bundles your code into optimized, minified files. Your beautiful src/components/CheckoutForm.vue becomes chunk-8f3a2b.js. Your descriptive variable names become single letters. Your 200 lines of checkout logic become a single wall of minified text.

Then an error happens in production:

TypeError: Cannot read properties of undefined (reading 'discount')
  at setup/b/< @ https://app.example.com/assets/CheckoutForm-RC3okFHd.js:1:338
  at setup/b/< @ https://app.example.com/assets/CheckoutForm-RC3okFHd.js:1:538
  at b @ https://app.example.com/assets/CheckoutForm-RC3okFHd.js:1:542

Line 1, column 338. That's your entire application on one line. Good luck finding the bug.

This is where source maps change everything. OpenObserve's RUM module lets you upload your .map files and automatically transforms these cryptic stack traces into readable, debuggable code with original filenames, line numbers, function names, and even the surrounding source code context.


What Are Source Maps?

Source maps are JSON files (.js.map) generated by your build tool alongside the minified output. They contain a mapping between every position in the minified file and the corresponding position in the original source code.

When your bundler outputs:

dist/
  CheckoutForm-RC3okFHd.js      ← minified, deployed to production
  CheckoutForm-RC3okFHd.js.map  ← source map, NOT deployed to production

The .js.map file contains:

  • Original source file paths (e.g., src/components/CheckoutForm.vue)
  • Position mappings (minified line:col → original line:col)
  • Original function/variable names
  • Embedded source content (optionally, the full original source code)
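
For illustration, a trimmed and entirely hypothetical `.js.map` for the checkout bundle above might look like the following (a real `mappings` string is a long run of Base64-VLQ segments, abbreviated here):

```json
{
  "version": 3,
  "file": "CheckoutForm-RC3okFHd.js",
  "sources": ["../../src/components/CheckoutForm.vue"],
  "sourcesContent": ["<the original CheckoutForm.vue source, embedded verbatim>"],
  "names": ["applyDiscount", "calculateTotal", "handleSubmit"],
  "mappings": "AAAA,SAASA,..."
}
```

The `sourcesContent` field is what later powers the source code context view; without it, only paths and positions can be recovered.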

Every modern bundler supports source maps: Vite, Webpack, Rollup, esbuild, Parcel, and SWC can all generate them. You almost certainly generate them already; you just might not be using them for production debugging.


How Source Maps Work in OpenObserve

The Architecture

OpenObserve's source map system has three parts:

  1. Upload & storage: you upload a ZIP containing your .js and .js.map files, tagged with service name, version, and environment
  2. Matching: when an error occurs, OpenObserve matches the minified filename in the stack trace to the correct source map using the service/version/environment metadata
  3. Translation: the backend resolves each stack frame position through the source map and returns the original file, line, column, function name, and surrounding source code

The Three-Dimensional Matching System

Source maps are matched to errors using three dimensions:

  • Service: which application produced the error (e.g., web-app, admin-dashboard, mobile-web)
  • Version: which build/release of that application (e.g., 2.4.1, 1.0.0-beta.3, abc123)
  • Environment: which deployment environment (e.g., production, staging, development)

This three-dimensional matching means you can:

  • Debug errors from different applications (microservice frontends)
  • Debug errors from different releases running simultaneously (gradual rollouts)
  • Debug errors from staging vs. production separately

When an error arrives, OpenObserve extracts the minified filename from the stack trace (e.g., CheckoutForm-RC3okFHd.js), then queries:

WHERE org = 'your-org'
  AND source_file_name = 'CheckoutForm-RC3okFHd.js'
  AND service = 'web-app'          -- from error metadata
  AND version = '2.4.1'            -- from error metadata
  AND env = 'production'           -- from error metadata

If a match is found, the source map is fetched and used for deobfuscation. If not, the original (minified) stack trace is displayed unchanged.
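
That lookup and its fallback can be sketched as follows (hypothetical types and helper, not the actual backend implementation):

```typescript
// Hypothetical sketch of the matching step: find an uploaded source map
// for this minified file + service/version/env, or return null, which
// means the original minified trace is displayed unchanged.
interface MapRecord {
  fileName: string;
  service: string;
  version: string;
  env: string;
}

function findSourceMap(
  uploads: MapRecord[],
  fileName: string,
  meta: { service: string; version: string; env: string },
): MapRecord | null {
  return (
    uploads.find(
      (u) =>
        u.fileName === fileName &&
        u.service === meta.service &&
        u.version === meta.version &&
        u.env === meta.env,
    ) ?? null
  );
}
```

Note that the match is exact on all dimensions: a version of `v2.4.1` will never match an upload tagged `2.4.1`.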

This is why getting the service/version/environment values right is critical: they must match between your SDK configuration and your source map upload.


Step-by-Step Setup

Step 1: Configure Your RUM SDK

When initializing @openobserve/browser-rum, set the service, env, and version fields to match your build:

import { openobserveRum } from '@openobserve/browser-rum';

openobserveRum.init({
  applicationId: 'checkout-app',
  clientToken: 'your-rum-token',          // from Settings → RUM Token
  site: 'https://cloud.openobserve.ai',
  organizationIdentifier: 'your-org',

  // These three fields MUST match your source map upload:
  service: 'web-app',
  env: 'production',
  version: '2.4.1',

  insecureHTTP: false,
  apiVersion: 'v1',
});

Tip: Use your git commit SHA or a build-injected variable as the version so every deployment is uniquely identifiable and stays in sync automatically:

// Injected at build time via Vite's `define` or an environment variable
openobserveRum.init({
  // ...
  version: __APP_VERSION__,  // e.g., 'a1b2c3d' (git short SHA) or '2.4.1'
});

Using the package.json version works if you bump it on every release, but a git SHA is safer: it guarantees uniqueness even for hotfix rebuilds of the same version.

Step 2: Generate Source Maps in Your Build

Most bundlers generate source maps by default or with minimal configuration:

Vite (vite.config.ts):

export default defineConfig({
  build: {
    sourcemap: true,   // generates .js.map files in dist/
  },
});

Webpack (webpack.config.js):

module.exports = {
  devtool: 'source-map',  // generates .js.map files
};

Important: Do NOT deploy .map files to your production CDN/server. Source maps contain your original source code and should only be uploaded to OpenObserve, not served to browsers.

Step 3: Package Source Maps as a ZIP

After your build completes, create a ZIP containing the minified .js files and their corresponding .js.map files:

cd dist/assets
zip -r sourcemaps.zip *.js *.js.map

The ZIP structure should look like:

sourcemaps.zip
├── CheckoutForm-RC3okFHd.js
├── CheckoutForm-RC3okFHd.js.map
├── vendor-B7kx9Dq2.js
├── vendor-B7kx9Dq2.js.map
├── index-Lm4nPq8R.js
└── index-Lm4nPq8R.js.map

Rules:

  • Every .js file must have a matching .js.map file (same name + .map extension)
  • If a .js.map file exists without a .js counterpart, it will be accepted (graceful fallback)
  • If a .js file has no .map file, the upload will fail with a validation error
  • Only .zip format is supported
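
These pairing rules can be checked before you zip anything. Here is a small sketch (a hypothetical helper operating on a list of ZIP entry names, mirroring the documented validation):

```typescript
// Validate ZIP entry names against the pairing rules described above:
// every .js needs a matching .js.map; orphan .js.map files are tolerated.
function validateZipEntries(names: string[]): string[] {
  const set = new Set(names);
  const errors: string[] = [];
  for (const n of names) {
    if (n.endsWith('.js') && !set.has(n + '.map')) {
      errors.push(`missing source map for ${n}`);
    }
  }
  return errors;
}
```

Running this in a pre-upload step turns a server-side validation failure into a fast local error.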

Step 4: Upload via the UI

  1. Navigate to RUM → Source Maps in OpenObserve
  2. Click Upload Source Maps
  3. Fill in the fields:
    • Service (required): Must match your SDK's service value (e.g., web-app)
    • Version (required): Must match your SDK's version value (e.g., 2.4.1)
    • Environment (optional): Must match your SDK's env value (e.g., production)
  4. Drag and drop your ZIP file (or click to select)
  5. Click Upload

The UI shows the selected filename and file size before upload. You can remove and re-select if needed.

Step 4 (Alternative): Upload via the API

For CI/CD integration, use the REST API:

curl -X POST "https://cloud.openobserve.ai/api/your-org/sourcemaps" \
  -H "Authorization: Bearer $OPENOBSERVE_TOKEN" \
  -F "service=web-app" \
  -F "version=2.4.1" \
  -F "env=production" \
  -F "file=@dist/sourcemaps.zip"

This returns 201 Created on success. Integrate this into your deployment pipeline to ensure source maps are uploaded with every release.

CI/CD Example (GitHub Actions):

- name: Build
  run: npm run build

- name: Package source maps
  run: cd dist/assets && zip -r ../../sourcemaps.zip *.js *.js.map

- name: Upload source maps to OpenObserve
  run: |
    curl -X POST "${{ secrets.O2_URL }}/api/${{ secrets.O2_ORG }}/sourcemaps" \
      -H "Authorization: Bearer ${{ secrets.O2_TOKEN }}" \
      -F "service=web-app" \
      -F "version=${{ github.sha }}" \
      -F "env=production" \
      -F "file=@sourcemaps.zip"

Step 5: Verify

After uploading, go back to RUM → Source Maps. You should see your upload listed with:

  • Service name
  • Version
  • Environment
  • Number of file pairs
  • Upload timestamp

Click the row to expand and see all individual file pairs (e.g., CheckoutForm-RC3okFHd.js + CheckoutForm-RC3okFHd.js.map).


The Deobfuscation Experience

Before: The Minified Stack Trace

In RUM → Error Tracking, you see an error with this stack trace:

TypeError: Cannot read properties of undefined (reading 'discount')
  at setup/b/< @ https://app.example.com/assets/CheckoutForm-RC3okFHd.js:1:338
  at setup/b/< @ https://app.example.com/assets/CheckoutForm-RC3okFHd.js:1:538
  at b @ https://app.example.com/assets/CheckoutForm-RC3okFHd.js:1:542

Line 1, column 338. Every function is named b. You have no idea what's happening.

After: The Pretty Stack Trace

Click the "Pretty" tab on the stack trace panel. OpenObserve sends the stack trace to the backend, which:

  1. Parses each line to extract the filename, line, and column
  2. Looks up the source map for CheckoutForm-RC3okFHd.js matching your service/version/env
  3. Resolves each position through the source map (converting 1-indexed to 0-indexed for the sourcemap library)
  4. Returns the original file path, line number, column, function name, and 11 lines of source context (5 before + the error line + 5 after)
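
The parsing in step 1 can be sketched as follows; the regex and frame format are assumptions based on the example traces in this post, not the actual parser:

```typescript
// Extract function name, minified filename, line, and column from one
// stack frame line of the form:  "at <fn> @ <url>:<line>:<col>"
function parseFrame(frameLine: string) {
  const m = frameLine.match(/at (.+?) @ (.+?):(\d+):(\d+)$/);
  if (!m) return null;
  const url = m[2];
  return {
    functionName: m[1],
    fileName: url.substring(url.lastIndexOf('/') + 1), // e.g. 'CheckoutForm-RC3okFHd.js'
    line: Number(m[3]),
    column: Number(m[4]),
  };
}
```

The extracted filename is what feeds the service/version/env lookup described earlier; the line and column feed the source map resolution.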

The result:

TypeError: Cannot read properties of undefined (reading 'discount')
  at applyDiscount @ src/components/CheckoutForm.vue:56:17
  at calculateTotal @ src/components/CheckoutForm.vue:49:3
  at handleSubmit @ src/components/CheckoutForm.vue:45:3

Now you know exactly where the error is: CheckoutForm.vue, line 56, in the applyDiscount function.

Source Code Context

The Pretty Stack Trace view goes further than just file and line number. For each frame:

  • The first frame is expanded by default, showing the surrounding source code
  • A syntax-highlighted code editor (Monaco) displays the source context with the error line highlighted
  • Additional frames are collapsed behind a "Show N more frames" button
  • Each frame can be individually expanded to inspect its source context

This means you can read the actual code around the error: what discount was supposed to be, where it was expected to come from, and what the surrounding logic looks like, all without opening your IDE.

Caching

Stack trace translations are cached on the frontend for 1 hour to avoid redundant API calls:

  • Cache key: orgId::hashOfStackTrace::service::version::env
  • Max entries: 50 in memory; when full, the oldest 10 entries (by timestamp) are evicted in a batch
  • Revisiting the same error within an hour loads the deobfuscated trace instantly
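
That eviction policy can be sketched like this (an assumed shape for illustration, not the actual OpenObserve frontend code):

```typescript
// Sketch of the documented cache policy: up to 50 entries; when full, the
// 10 oldest entries (by insertion timestamp) are evicted in one batch.
// Keys follow the documented format: orgId::hashOfStackTrace::service::version::env
interface CacheEntry { value: string; ts: number }

class TraceCache {
  private entries = new Map<string, CacheEntry>();
  constructor(private max = 50, private evictBatch = 10) {}

  set(key: string, value: string, now = Date.now()) {
    if (this.entries.size >= this.max) {
      const oldest = [...this.entries.entries()]
        .sort((a, b) => a[1].ts - b[1].ts)
        .slice(0, this.evictBatch);
      for (const [k] of oldest) this.entries.delete(k);
    }
    this.entries.set(key, { value, ts: now });
  }

  get(key: string) { return this.entries.get(key)?.value; }
  get size() { return this.entries.size; }
}
```

Batch eviction trades a little memory headroom for fewer eviction passes compared with evicting one entry at a time.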

Managing Source Maps

Listing and Filtering

The RUM → Source Maps page shows all uploaded source maps grouped by service/version/environment. You can filter by:

  • Service: dropdown of all services with uploaded maps (top 10)
  • Version: dropdown of all versions (top 10)
  • Environment: dropdown of all environments (top 10)

Filter values are fetched dynamically from the API. Pagination supports 20, 50, 100, or 250 rows per page.

Deleting Source Maps

To delete source maps for a specific release:

  1. Find the row in the Source Maps list
  2. Click the delete icon in the Actions column
  3. Confirm in the dialog

This deletes all file pairs for that service/version/environment combination. Deletion is atomic: you can't delete individual files within a group.

Via API:

curl -X DELETE "https://cloud.openobserve.ai/api/your-org/sourcemaps?service=web-app&version=2.4.1&env=production" \
  -H "Authorization: Bearer $OPENOBSERVE_TOKEN"

All three parameters must match exactly for deletion to proceed; this prevents accidental bulk deletes.

Storage and Architecture

Under the hood, source maps are stored as follows:

  1. Each .js.map file is assigned a UUID-based storage name (e.g., a1b2c3d4-e5f6.js.map)
  2. Files are stored in OpenObserve's object storage (S3, MinIO, or local filesystem, depending on your deployment)
  3. A database entry maps the original filename to the storage UUID, along with service/version/env metadata
  4. An LRU cache keeps frequently-accessed source maps in memory for fast translation

In enterprise cluster deployments, source maps are replicated across nodes:

  • If a translation request hits a node that doesn't have the source map locally, it fetches it via gRPC from the node that does
  • The fetched file is cached locally to avoid repeated cross-node requests
  • Add/delete operations are broadcast via cluster watch events so all nodes update their caches

Real-World Example: From Upload to Debug

Here's the actual flow demonstrated with a real stack trace from the OpenObserve test suite:

1. Original minified error:

TypeError: can't access property "nonExistent", e is undefined
  at setup/b/< @ http://localhost:4173/assets/AboutView-RC3okFHd.js:1:338
  at setup/b/< @ http://localhost:4173/assets/AboutView-RC3okFHd.js:1:538
  at b @ http://localhost:4173/assets/AboutView-RC3okFHd.js:1:542

2. After source map translation:

TypeError: can't access property "nonExistent", e is undefined
  at obj @ ../../src/components/ErrorDemo.vue:56:17
  at fn3 @ ../../src/components/ErrorDemo.vue:49:3
  at fn2 @ ../../src/components/ErrorDemo.vue:45:3

3. In the Pretty Stack Trace UI, the first frame expands to show:

51│
52│  const obj = (input: any) => {
53│    // This will throw a TypeError because input is undefined
54│    // and we're trying to access 'nonExistent' property
55│    const nested = input.nested;
56│    return nested.nonExistent.deep.value;   // ← ERROR LINE (highlighted)
57│  };
58│
59│  const fn3 = () => {
60│    return obj(undefined);
61│  };

Now you can see: the function receives undefined as input, tries to access input.nested, gets undefined, then tries to access .nonExistent on undefined. The fix is obvious: add a null check.
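
For illustration only, a guarded version of that demo function (using optional chaining; our sketch, not code from the OpenObserve test suite) could look like:

```typescript
// Guarded rewrite of the ErrorDemo.vue snippet: optional chaining short-circuits
// on undefined at every step instead of throwing a TypeError.
const obj = (input: any) => {
  const nested = input?.nested;
  return nested?.nonExistent?.deep?.value ?? null; // returns null instead of throwing
};
```
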


CI/CD Integration Best Practices

Automate Upload in Your Deployment Pipeline

Never rely on manual uploads. Source maps should be uploaded automatically as part of every deployment:

# In your deploy script:
npm run build
cd dist/assets && zip -r ../../sourcemaps.zip *.js *.js.map && cd ../..
curl -X POST "$O2_URL/api/$O2_ORG/sourcemaps" \
  -H "Authorization: Bearer $O2_TOKEN" \
  -F "service=$SERVICE_NAME" \
  -F "version=$(git rev-parse --short HEAD)" \
  -F "env=$DEPLOY_ENV" \
  -F "file=@sourcemaps.zip"

Use Git SHA as Version

Using your git commit SHA (or short SHA) as the version ensures every build has a unique version identifier:

// vite.config.ts
export default defineConfig({
  define: {
    __APP_VERSION__: JSON.stringify(process.env.COMMIT_SHA || 'dev'),
  },
});

// main.ts
openobserveRum.init({
  version: __APP_VERSION__,
  // ...
});

Clean Up Old Source Maps

Source maps accumulate over time. Set up a periodic cleanup job to delete maps older than your retention window:

# Delete source maps for old versions via the management API
# Keep maps for the last N versions or last 30 days

Don't Ship Source Maps to Production

Your build should generate source maps but not include them in the deployed assets. However, the .map files you upload to OpenObserve must contain embedded source content (sourcesContent); this is what powers the syntax-highlighted source code context in the Pretty Stack Trace view.

By default, most bundlers embed sourcesContent in .map files, so no extra configuration is needed:

// vite.config.ts
export default defineConfig({
  build: {
    sourcemap: true,  // generates .map files with sourcesContent included by default
  },
});

Important: Do NOT set sourcemapExcludeSources: true in your Rollup/Vite config. This strips the sourcesContent field from .map files, which means OpenObserve can resolve file paths and line numbers but cannot display the surrounding source code context. If you need smaller .map files for other tooling, generate two outputs: one with full source content for the OpenObserve upload, and the stripped version for everything else.

After building, upload the .map files to OpenObserve, then exclude them from your deployment artifact. Your production server should never serve .map files to browsers; they contain your original source code.


Troubleshooting

"No Source Maps Found" Message

If the Pretty Stack Trace shows a "No source maps" message with service/version badges:

  1. Check the service name: does the badge show the same service name you uploaded with?
  2. Check the version: does it match exactly? (e.g., 2.4.1 vs v2.4.1)
  3. Check the environment: if you specified env in the SDK, did you also specify it during upload?
  4. Verify the upload: go to RUM → Source Maps and confirm your upload is listed
  5. Check file naming: the minified filename in the stack trace must exactly match the .js filename in your ZIP

The most common issue is a version mismatch. If your SDK sends version: '2.4.1' but you uploaded with version: 'v2.4.1', no match will be found.

Stack Trace Lines Not Resolving

If some lines resolve but others don't:

  • The unresolved files might not have been included in the ZIP
  • The source map might not contain mappings for that specific position (rare, usually indicates a source map generation issue)
  • Third-party library code (e.g., from node_modules) may not have source maps in your bundle

Upload Fails

  • "Every .js file must have a matching .js.map": your ZIP contains a .js file without a corresponding .map file. This fails the entire upload; either add the missing map or remove the orphaned .js before zipping. Filter your ZIP to only include files that have matching pairs:
    cd dist/assets
    for f in *.js; do [ -f "$f.map" ] && echo "$f" "$f.map"; done | xargs zip sourcemaps.zip
    
  • File too large: individual .map files cannot exceed 5 MB, and the total ZIP cannot exceed 100 MB. If your source maps are very large, consider splitting into multiple ZIPs per route/chunk
  • Not a ZIP: only .zip format is accepted

Conclusion

Minified stack traces are the single biggest obstacle to debugging production JavaScript errors quickly. Without source maps, you're reverse-engineering compressed code. With them, you're reading the original source with context, line numbers, and function names.

OpenObserve makes the source map workflow straightforward:

  1. Upload a ZIP via the UI or API, tagged with service/version/environment
  2. Errors are automatically deobfuscated when you view them in the Pretty Stack Trace tab
  3. Syntax-highlighted source context shows the exact code around the error
  4. Three-dimensional matching handles multi-app, multi-version, multi-environment deployments
  5. CI/CD-friendly API integrates into any deployment pipeline with a single curl command

Stop squinting at chunk-8f3a.js:1:28432. Upload your source maps and start debugging production errors in seconds.



About the Authors

Bhargav Patel


Bhargav Patel is a frontend-focused Software Engineer working on observability platforms. He builds seamless user experiences for visualizing and interacting with system data like logs, metrics, and traces. His focus is on performance, usability, and developer experience.

Simran Kumari


Passionate about observability, AI systems, and cloud-native tools. All in on DevOps and improving the developer experience.
