

Trace Machina: Simulation Infrastructure Company Raises $4.7 Million (Seed)


Trace Machina, a company building simulation infrastructure for safety-critical technologies in physical-world AI, has launched out of stealth with $4.7 million in funding. Wellington Management led the seed round, with participation from Samsung Next, Sequoia Capital Scout Fund, Green Bay Ventures, and Verissimo Ventures. Angel investors include Clem Delangue (CEO of Hugging Face), Mitch Wainer (Co-Founder of DigitalOcean), Gert Lackriet (Director of Applied Machine Learning at Amazon), and other industry leaders from OpenAI and MongoDB.

Trace Machina features engineers and product leaders from Apple, Google, MongoDB, and Toyota Research Institute. The company’s first product, NativeLink, is open source and has already surpassed 1,000 stars on GitHub, with contributing engineers from Tesla, General Motors (Cruise), Samsung, and X.

NativeLink offers an advanced staging environment for futuristic technologies where safety and testing are critical, such as self-driving cars, aviation, robotics, and other autonomous hardware systems. It brings AI to the edge, effectively turning local devices into supercomputers.

Before launching Trace Machina, CEO and Co-Founder Marcus Eagan worked on MongoDB Atlas Vector Search, MongoDB’s first AI product. He has also contributed to some of the largest and most widely adopted open-source projects in the world, such as Lucene, Solr, and Superset, and most recently completed a project with Nvidia that enables engineers to run Lucene on GPUs and index data 20X faster. Nathan Bruer (Chief Architect and Co-Founder of Trace Machina) worked on several secret projects at Google’s X and built autonomous driving software as an engineer at the Toyota Research Institute.

The key benefits of NativeLink include:

1.) AI at the edge – For developers building native mobile applications, NativeLink’s technology is free and avoids the costs and latency associated with traditional cloud infrastructure. This typically drives cloud cost savings of 50% to 70% for next-generation companies that need to test their code on GPUs to build AI applications.

2.) Increased productivity – NativeLink’s feedback loop for building futuristic systems speeds up compilation, testing, simulations, and other workflows by up to 80% through efficient caching that avoids rerunning unchanged code and parallel processing with remote execution, which also reduces expensive test flakiness on GPUs and other specialized hardware (see the configuration sketch after this list).

3.) Built for scale – NativeLink’s Rust-based architecture eliminates whole classes of errors, race conditions, and stability issues at scale, while improving the reliability of critical development pipelines where re-running tests on GPUs inflates cloud costs.
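
For a concrete sense of how the caching and remote execution described above plug into an existing build, here is a minimal, hypothetical configuration sketch. It assumes a Bazel-based project pointed at a NativeLink deployment that exposes standard remote-cache and remote-execution endpoints; the endpoint addresses and job count below are illustrative placeholders, not official values.

```
# .bazelrc (sketch) -- assumes a NativeLink deployment reachable at the
# placeholder addresses below; substitute your own endpoints.

# Reuse previously built artifacts so unchanged code is never rebuilt.
build --remote_cache=grpcs://cas.nativelink.example.com

# Fan builds and tests out to remote workers (e.g. GPU machines) in parallel.
build --remote_executor=grpcs://scheduler.nativelink.example.com
build --jobs=200

# Fall back to building locally if the remote endpoint is unreachable.
build --remote_local_fallback
```

With a setup along these lines, the caching and remote execution happen transparently: Bazel checks the remote cache before compiling or running anything, and only work whose inputs have changed is dispatched to the remote executors.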

Leading companies are already using NativeLink in production, including Menlo Security, CIQ (the founding sponsor of Rocky Linux), and Samsung, which is also an investor in the company. Samsung uses NativeLink to compile, test, and validate operating-system-level software for over a billion devices, with NativeLink serving billions of requests per month.

KEY QUOTES:

“We’re enabling the next generation to more easily build futuristic technologies like what you read about in sci-fi novels or see in films. This has historically been unattainable or uneconomical due to limitations of existing infrastructure and tools. We’re moving beyond machine learning solely focused on language and pattern matching to a new wave of AI that’s more human-like in its ability to maneuver around obstacles and modify objects.”

  • Marcus Eagan, CEO and Co-Founder of Trace Machina

“NativeLink is providing critical infrastructure for the industries of tomorrow such as aerospace and autonomous mobility. Over 100 companies are using NativeLink’s Cloud service to significantly enhance their complex builds. They are taking advantage of simulation infrastructure that simply never existed before.”

  • Van Jones, Deal Lead at Wellington Access Ventures

“NativeLink has been instrumental in reducing our longest build times from days to hours and significantly improving our engineering efficiency. Its scalable infrastructure and robust caching capabilities have been game-changers for our team – we’re excited about the future possibilities with NativeLink.”

  • David Barr, Principal Engineer at Samsung