
Exclusive: Computer-vision startup aims to change how machines perceive the world

[Photo: Tristan Swedish and Sebastian Bauer are the co-founders of Ubicept. Credit: Ubicept]

A startup with its eye on revamping computer-vision technology has raised new funding. 

Today Ubicept announced it has raised an oversubscribed $8 million seed round, led by Ubiquity Ventures and the MIT Media Lab's E14 Fund, with participation from the Wisconsin Alumni Research Foundation, Phoenix Venture Partners, and several other angel investors.

Ubicept says its new technology, unlike conventional camera systems, can capture sharp images in extreme lighting conditions or high-speed motion. The startup was founded in 2021 and is led by CEO Sebastian Bauer and CTO Tristan Swedish, previously of the University of Wisconsin-Madison and MIT, respectively. Ubicept has a bench at The Engine and office space in Boston. 

Replacing “still frames” with single-photon sensors

Bauer used rain as a metaphor to describe how Ubicept's computer-vision technology differs from traditional approaches. Light is composed of individual particles called photons, which Bauer likened to raindrops. In a traditional sensor, he said, each pixel acts as a bucket that collects raindrops over the exposure time.

“A certain amount of water volume accumulates. When you have a bright pixel, it’s a lot of water. If it’s a dim pixel, it’s not so much water volume,” Bauer said.

The problem these sensors run into is buckets that overflow with raindrops, or buckets that catch only a few drops. Too much light creates white blotches in images, and too little light leaves dark patches, Bauer said.

Most sensors used today also have trouble capturing things in motion, Bauer said. If the scene shifts while the rain is falling, drops that should land in one bucket spread across several neighboring buckets, blurring the image.
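The bucket analogy can be made concrete with a toy simulation. This is a minimal sketch with illustrative numbers, not Ubicept's actual sensor model: each pixel sums photon arrivals over the exposure and clips at its capacity, and motion smears one point's photons across several pixels.

```python
import numpy as np

# Toy "bucket" model of a conventional sensor (numbers are illustrative):
# each pixel sums photons over the exposure, then clips at its capacity.
full_well = 100                     # bucket capacity (saturation level)
arrivals = np.array([3, 60, 450])   # photons hitting a dim, mid, bright pixel
readout = np.minimum(arrivals, full_well)
print(readout)          # [  3  60 100] -> the 450-photon pixel clips to white

# Motion blur: if the scene shifts mid-exposure, one point's photons
# land in several neighboring buckets instead of one.
moving_point = np.zeros(5)
for shift in range(3):              # the point crosses 3 pixels during exposure
    moving_point[shift] += 450 / 3
print(moving_point)     # [150. 150. 150.   0.   0.] -> energy smeared out
```

The clipped pixel reads exactly 100 no matter how bright the scene gets above that level, which is the "overflowing bucket" Bauer describes; detail above the clip point is unrecoverable.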

Ubicept’s technology differs in that it detects each individual photon and records its arrival time at high resolution.

“It’s not really a bucket anymore, but once the photon crosses through that area, it is time tagged with extremely high time resolution,” Bauer said.

Capturing photons this way enables much better image quality, Bauer said, and the company believes its approach will replace a good share of conventional cameras and sensors in the next few years.
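The practical difference time-tagging makes can be sketched in a few lines. This is an assumed, simplified model rather than Ubicept's processing pipeline: because every photon's arrival time is kept, the effective "exposure" becomes a software choice made after capture, so the same data can be binned into short windows (to freeze motion) or long windows (to gather light) without losing any photons.

```python
import numpy as np

# Simplified single-photon model: the sensor records a timestamp for
# every photon instead of one summed bucket value per pixel.
rng = np.random.default_rng(1)
exposure = 1.0                                       # seconds
timestamps = np.sort(rng.uniform(0, exposure, 500))  # 500 time-tagged photons

# The "exposure" is now chosen in software, after capture:
short_windows = np.histogram(timestamps, bins=50)[0]  # 20 ms frames, freeze motion
long_windows = np.histogram(timestamps, bins=5)[0]    # 200 ms frames, gather light

print(short_windows.sum(), long_windows.sum())        # both 500: no photon lost
```

With a conventional sensor, the trade-off between motion blur and light gathering is fixed the instant the shutter closes; here it is deferred, which is what lets one capture serve both fast motion and dim scenes.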

To demonstrate its single-photon processing software, the Ubicept team built a miniature village scene on a pottery wheel at its bench in The Engine. The display sits inside a box covered in blackout curtains, with a light in one corner to simulate different lighting levels, Swedish explained. As the display spins on the pottery wheel and shadows dance across the box, the team captures the scene with two cameras: one running Ubicept's single-photon processing software, and a conventional low-light camera. The difference in imagery is evident.

The low-light camera captures blurry imagery as the pottery wheel picks up speed and has trouble showing objects in high or low light. Ubicept’s technology provides a much cleaner picture of the scene. Watch a video of this experiment below.

Swedish said this setup mirrors the conditions an automotive camera would face in collision detection or automated braking assistance.

“If something’s moving too fast, it blurs and you can’t see the pedestrian or you can’t see the car. This is just kind of like a known problem for automotive cameras, but you can take this use case in automotive and apply it to basically any time you’re using a camera to try and perceive the world,” Swedish said. “Since we capture (light) in a fundamentally different way, we can then present it in a way that’s kind of the best way possible for object detection.”

Mobility and beyond

Calvin Chin, managing partner at E14 Fund, said Ubicept’s work is so critical because it provides high-quality information in the mobility space, which can have life or death ramifications. The venture capital fund is affiliated with the MIT Media Lab and was founded in 2013 to support MIT's deep-tech spinoffs.

AI and machine learning are all the rage right now. Chin said it’s important that we have these “powerful machines,” but we also need to think about giving them the highest quality data.

“When it comes to sensing like Ubicept, this is like the limits of physics. You’re counting every single photon, so you can imagine that’s just going to provide the best possible information for all the AI to make important decisions,” Chin said.

Getting new technologies accepted into the automotive industry is a long, difficult process, Swedish said. For now, Ubicept is focusing on how its technology could be used in smaller specialized vehicles, industrial automation and 3D scanning, an expansion that will be fueled by the new funding. The company has a demo kit it is testing with customers; the next step is putting that technology into a product with them, Swedish said.

“It’s really exciting. We’re getting a lot of interest. But it also is important with every funding round that we can deploy this in use cases that will be in products as soon as next year potentially,” Swedish said.

Ubicept is also growing its team with this new funding. The company has four full-time employees in addition to its co-founders and is hiring programmers and people with computer vision and graphics experience.

