react-native-turboxml: Under the Hood of a Native XML Parser

Introduction
If you've ever tried parsing multi-megabyte XML files in a React Native app, you know the pain: the UI freezes, animations stutter, and users wonder if your app has crashed. JavaScript-based XML parsers like react-native-xml2js run on the JS thread, blocking everything else while they churn through your data.
react-native-turboxml solves this problem by moving XML parsing entirely into native code. Built with Kotlin (Android) and Objective-C (iOS), it leverages React Native's New Architecture and TurboModules to deliver faster parsing without blocking your UI.
TurboXML (left) keeps the UI smooth while fast-xml-parser (right) freezes the entire app.
In this deep dive, we'll explore exactly how it works under the hood—and why it's the best choice for XML-heavy React Native apps.
The Problem with JavaScript XML Parsers
Before diving into the solution, let's understand why traditional XML parsing is so problematic in React Native:
- Single-threaded JavaScript: React Native's JS engine runs on a single thread. When you parse a large XML file, nothing else can execute: no animations, no user interactions, nothing (see the sketch after this list).
- Bridge overhead: The old architecture requires serializing data across the bridge between JS and native code. For large XML files, this serialization alone can take seconds.
- Memory pressure: JavaScript parsers often create intermediate string representations, doubling or tripling memory usage for large files.
- No parallelization: JavaScript can't utilize multiple CPU cores for parsing, leaving performance on the table.
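To make the single-threaded point concrete, here is a small illustrative sketch: a busy loop stands in for a large synchronous parse, and the timings are arbitrary.

```ts
// Any long synchronous task starves the JS thread; a busy loop stands in
// for a large synchronous XML parse here.
setInterval(() => console.log('tick'), 100) // stops ticking while the loop runs

function blockJsThread(ms: number) {
  const end = Date.now() + ms
  while (Date.now() < end) {
    // CPU-bound work on the JS thread: no timers, touches, or animations run
  }
}

blockJsThread(3000) // UI and JS-driven animations freeze for roughly 3 seconds
```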
Real-world impact: In our testing with a 5MB XML configuration file, react-native-xml2js took 35 seconds to parse. The same file with react-native-turboxml? Under 5 seconds.
Why TurboModules Change Everything
React Native's New Architecture introduces TurboModules, which provide:
- Direct native calls via JSI (JavaScript Interface)—no more bridge serialization
- Lazy loading of native modules
- Type-safe communication between JS and native code
- Synchronous and asynchronous method support
react-native-turboxml is built from the ground up for TurboModules, meaning it bypasses the old bridge entirely and communicates directly with native code.
1. Module Structure & Spec
The module implements the generated NativeTurboxmlSpec interface. Its name "Turboxml" makes it available to JavaScript as:
```js
import { parseXml } from 'react-native-turboxml'
```
On both Android and iOS, no manual C++ wiring is needed; React Native's New Architecture toolchain (Codegen) generates the binding code automatically from the spec.
TypeScript support is built-in:
```ts
function parseXml(xml: string): Promise<Record<string, unknown>>
```
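For reference, a codegen spec for a module like this typically looks something like the sketch below. Only the NativeTurboxmlSpec name and the "Turboxml" module name come from the description above; the file name and return type are assumptions.

```ts
// NativeTurboxml.ts: minimal sketch of a TurboModule codegen spec (assumed layout)
import type { TurboModule } from 'react-native'
import { TurboModuleRegistry } from 'react-native'

export interface Spec extends TurboModule {
  parseXml(xml: string): Promise<Object>
}

// Codegen derives the NativeTurboxmlSpec native interface from this spec.
export default TurboModuleRegistry.getEnforcing<Spec>('Turboxml')
```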
2. parseXml Implementation
Android (Kotlin)
The Android implementation uses Kotlin coroutines and Jackson's XmlMapper for maximum performance:
- Launches a coroutine on Dispatchers.Default (a background thread pool)
- Reads the raw XML into a Kotlin Map via Jackson's XmlMapper
- Cleans and normalizes the data
- Wraps it under the root tag and converts it to React Native's WritableMap
- Resolves (or rejects) back on the main thread
```kotlin
override fun parseXml(xml: String, promise: Promise) {
  CoroutineScope(Dispatchers.Default).launch {
    try {
      val parsed = xmlMapper.readValue<Map<String, Any?>>(xml)
      val cleaned = clean(parsed)
      val normalized = normalize(cleaned)
      val rootTag = extractRoot(xml)
      val wrapped = mapOf(rootTag to normalized)
      val result = toWritableMap(wrapped)
      withContext(Dispatchers.Main) { promise.resolve(result) }
    } catch (e: Exception) {
      withContext(Dispatchers.Main) { promise.reject("XML_PARSE_ERROR", e.message, e) }
    }
  }
}
```
Why Jackson? Jackson's XmlMapper is one of the fastest XML parsers available on the JVM. It's battle-tested in enterprise Java applications and handles edge cases gracefully.
iOS (Objective-C)
On iOS, the implementation uses NSXMLParser with Grand Central Dispatch:
- Dispatches parsing to a background queue using GCD
- Parses the XML using NSXMLParser with delegate callbacks
- Builds nested NSDictionary structures during parsing
- Resolves the promise on the main thread
```objc
RCT_EXPORT_METHOD(parseXml:(NSString *)xml
                  resolve:(RCTPromiseResolveBlock)resolve
                  reject:(RCTPromiseRejectBlock)reject)
{
  dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSData *data = [xml dataUsingEncoding:NSUTF8StringEncoding];
    NSXMLParser *parser = [[NSXMLParser alloc] initWithData:data];
    TurboXMLParserDelegate *delegate = [[TurboXMLParserDelegate alloc] init];
    parser.delegate = delegate;

    if ([parser parse]) {
      dispatch_async(dispatch_get_main_queue(), ^{
        resolve(delegate.result);
      });
    } else {
      dispatch_async(dispatch_get_main_queue(), ^{
        reject(@"XML_PARSE_ERROR", @"Failed to parse XML", parser.parserError);
      });
    }
  });
}
```
Why NSXMLParser? It's Apple's built-in SAX parser—memory-efficient, fast, and doesn't require any external dependencies.
3. Helper Functions
Both platforms implement similar helper functions to ensure consistent output:
- extractRoot: Uses a simple regex to find the top-level tag name, preserving the XML structure in the output (a rough sketch of the idea follows this list).
- clean: Recursively removes blank keys, empty strings, and empty lists. This prevents cluttered output from XML files with optional fields.
- normalize: Ensures every value is wrapped consistently for predictable access patterns in JavaScript.
- toWritableMap / toWritableArray: Convert native collections into React Native's WritableMap and WritableArray for seamless JS consumption.
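The real helpers live in Kotlin and Objective-C, but the extractRoot idea is simple enough to sketch in TypeScript. This is an illustration of the approach, not the module's actual implementation:

```ts
// Hypothetical illustration of the extractRoot idea
function extractRoot(xml: string): string {
  // Grab the first element name, skipping the <?xml ...?> declaration and comments
  const match = xml.match(/<\s*([A-Za-z_][\w.-]*)/)
  if (!match) throw new Error('No root element found')
  return match[1]
}

extractRoot('<?xml version="1.0"?><config><app/></config>') // "config"
```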
4. Multithreading: The Secret Sauce
The real win is that the heavy lifting happens on background threads, in parallel with your UI and JavaScript code:
Android
Kotlin coroutines with Dispatchers.Default run on a shared thread pool sized to your device's CPU cores. Each parse job runs on one of those background threads, off the UI and JS threads, and several parse calls can run concurrently on a modern multi-core phone.
iOS
Grand Central Dispatch (GCD) manages a pool of threads optimized for the device. The DISPATCH_QUEUE_PRIORITY_DEFAULT queue balances performance with battery efficiency.
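Because each call is handed off to a native background thread or queue, you can also parse several documents concurrently from JavaScript. A minimal sketch (the parseAll helper is illustrative, not part of the library):

```ts
import { parseXml } from 'react-native-turboxml'

// Each parseXml call does its work natively in the background; the JS thread
// only coordinates the promises.
async function parseAll(xmlDocs: string[]): Promise<Record<string, unknown>[]> {
  return Promise.all(xmlDocs.map((xml) => parseXml(xml)))
}
```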
Benefits on both platforms:
- Non-blocking UI: Your app stays responsive during parsing
- Multi-core utilization: Takes advantage of modern mobile CPUs
- Memory efficiency: Native parsers use less memory than JS alternatives
- Battery friendly: Native code is more power-efficient than JS execution
5. Performance Benchmarks
Here's how react-native-turboxml compares to popular alternatives:
| Parser | 1MB XML | 5MB XML | 10MB XML |
|---|---|---|---|
| react-native-xml2js | 2.1s | 35s | 78s |
| fast-xml-parser (JS) | 1.8s | 28s | 62s |
| react-native-turboxml | 0.4s | 4.8s | 9.2s |
Tested on iPhone 14 Pro and Pixel 7, averaged across 10 runs.
That's 4-8× faster depending on file size—and more importantly, your UI never freezes.
6. Real-World Use Cases
react-native-turboxml shines in these scenarios:
Offline Maps & Navigation
GPS/mapping apps often use XML for offline map data (OSM format). Parsing these multi-megabyte files needs to be fast and non-blocking.
Enterprise Configuration
Large configuration files from enterprise backends are often XML-based. Quick parsing means faster app startup.
Data Synchronization
Apps that sync large datasets (inventory, catalogs, schedules) benefit from parsing that doesn't freeze the UI during sync.
Legacy API Integration
Many enterprise and government APIs still return XML. TurboXML lets you handle these without performance penalties.
7. Integration & Usage
Getting started is simple:
Install
```sh
npm install react-native-turboxml
# or
yarn add react-native-turboxml

# For iOS
cd ios && pod install
```
Requirements
- React Native 0.71+
- New Architecture enabled (see the snippet after this list)
- Android 5.0+ or iOS 13.0+
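On React Native versions where the New Architecture is not enabled by default, the usual switches are the ones below; double-check the steps for your specific RN version.

```sh
# android/gradle.properties
newArchEnabled=true

# iOS
cd ios && RCT_NEW_ARCH_ENABLED=1 pod install
```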
Basic Usage
```js
import { parseXml } from 'react-native-turboxml'

const xmlString = `
<config>
  <app>
    <name>MyApp</name>
    <version>2.0.0</version>
  </app>
  <features>
    <feature enabled="true">Dark Mode</feature>
    <feature enabled="false">Beta Features</feature>
  </features>
</config>
`

parseXml(xmlString)
  .then((data) => {
    console.log(data.config.app.name) // "MyApp"
  })
  .catch((err) => console.error('Parse error', err))
```
With Async/Await
```js
async function loadConfig() {
  try {
    const response = await fetch('https://api.example.com/config.xml')
    const xmlString = await response.text()
    const config = await parseXml(xmlString)
    return config
  } catch (error) {
    console.error('Failed to load config:', error)
    throw error
  }
}
```
Error Handling
```js
import { parseXml } from 'react-native-turboxml'

try {
  const data = await parseXml(malformedXml)
} catch (error) {
  if (error.code === 'XML_PARSE_ERROR') {
    // Handle malformed XML
    console.error('Invalid XML:', error.message)
  }
}
```
8. Best Practices
To get the most out of react-native-turboxml:
- Don't hold giant strings in memory: If possible, stream large files or process them in chunks.
- Parse early, cache results: Parse XML once at startup or sync time, then work with the JavaScript objects (a small memoization sketch follows this list).
- Use TypeScript: The built-in types help catch errors early and improve IDE autocomplete.
- Handle errors gracefully: XML from external sources can be malformed. Always wrap parsing in try/catch.
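For the "parse early, cache results" point, even memoizing by the raw XML string goes a long way. The helper below is a sketch, not part of the library:

```ts
import { parseXml } from 'react-native-turboxml'

// Sketch: parse each distinct XML payload once and reuse the promise afterwards.
const cache = new Map<string, Promise<Record<string, unknown>>>()

function parseXmlCached(xml: string): Promise<Record<string, unknown>> {
  let result = cache.get(xml)
  if (!result) {
    result = parseXml(xml)
    cache.set(xml, result)
  }
  return result
}
```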
Conclusion
react-native-turboxml delivers what React Native developers have needed for years: fast, non-blocking XML parsing that works on both platforms.
By leveraging native code, TurboModules, and platform-specific optimizations, it achieves:
- 4-8Ă— faster parsing than JavaScript alternatives
- Zero UI blocking thanks to background threading
- Cross-platform support for Android and iOS
- Simple API with full TypeScript support
- Production-ready with proper error handling
If your React Native app deals with XML—whether it's configuration files, API responses, or offline data—react-native-turboxml is the performance upgrade you've been waiting for.
Get Started Today
```sh
npm install react-native-turboxml
```
Have questions or feedback? Open an issue on GitHub or reach out—contributions are welcome!