
Pretty much every self-driving car on the road, not to mention many a robot and drone, uses lidar to sense its surroundings. But useful as lidar is, it also involves physical compromises that limit its capabilities. Lumotive is a new company with funding from Bill Gates and Intellectual Ventures that uses metamaterials to exceed those limits, perhaps setting a new standard for the industry.
The company is just now coming out of stealth, but it’s been in the works for a long time. If the terms “metamaterials” and “Intellectual Ventures” tickle something in your brain, it’s because the company has spawned several startups that use intellectual property developed there, building on the work of materials scientist David Smith. I actually met with the team back in 2017, when the project was very hush-hush and operating under a different name at IV’s startup incubator.
Metamaterials are essentially specially engineered surfaces with microscopic structures — in this case, tunable antennas — embedded in them, working as a single device.

Echodyne is another company that used metamaterials to great effect, shrinking radar arrays to pocket size by engineering a radar transceiver that’s essentially 2D and can have its beam steered electronically rather than mechanically.
The principle works for pretty much any wavelength of electromagnetic radiation — i.e. you could use X-rays instead of radio waves — but until now no one has made it work with visible light. That’s Lumotive’s advance, and the reason it works so well.
Flash, 2D and 1D lidar
Lidar basically works by bouncing light off the environment and measuring how and when it returns; this can be accomplished in several ways. Flash lidar basically sends out a pulse that illuminates the whole scene with near-infrared light (905 nanometers, most likely) at once. This provides a quick measurement of the whole scene, but limited distance, as the power of the light being emitted is limited.
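The “measuring how and when it returns” part is simple time-of-flight arithmetic: distance is half the round-trip time multiplied by the speed of light. A minimal sketch (the function name and example timing are illustrative, not from Lumotive):

```python
C = 299_792_458.0  # speed of light in a vacuum, m/s

def range_from_echo(round_trip_seconds: float) -> float:
    """Distance to a target given a lidar pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is half
    the total path length covered in that time.
    """
    return C * round_trip_seconds / 2.0

# An echo arriving ~667 nanoseconds after the pulse corresponds to a
# target roughly 100 meters away.
print(f"{range_from_echo(667e-9):.1f} m")
```

This is also why emitted power matters for range: a more distant target returns a weaker echo, and below the detector's noise floor there is simply no return to time.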

2D or raster scan lidar takes an NIR laser and plays it over the scene incredibly quickly, left to right, down a bit, then does it again, again and again… scores or hundreds of times. Focusing the power into a beam gives these systems excellent range, but similar to a CRT TV with an electron beam tracing out the image, it takes rather a long time to complete the whole scene. Turnaround time is naturally of major importance in driving situations.
1D or line scan lidar strikes a balance between the two, using a vertical line of laser light that only has to go from one side to the other to complete the scene. This sacrifices some range and resolution but significantly improves responsiveness.
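The timing trade-off between the raster and line-scan approaches falls out of simple counting: a 2D scanner must dwell on every pixel in sequence, while a 1D scanner emits a whole vertical line at once and only sweeps columns. A back-of-envelope sketch, where the dwell time is an assumed illustrative figure, not a quoted spec:

```python
DWELL_S = 1e-6          # assumed time per laser dwell position (illustrative)
COLS, ROWS = 1000, 256  # scene resolution (matches the spec quoted below)

# 2D raster scan: one dwell per pixel, visited serially.
raster_frame_s = COLS * ROWS * DWELL_S

# 1D line scan: one dwell per column, since each dwell covers a full
# vertical line of the scene.
line_frame_s = COLS * DWELL_S

print(f"2D raster frame time: {raster_frame_s * 1e3:.0f} ms")
print(f"1D line frame time:   {line_frame_s * 1e3:.0f} ms")
```

Under these assumptions the line scanner finishes a frame hundreds of times faster, which is the responsiveness gain the article describes; the cost is that each line's laser power is spread over many vertical samples at once, trimming range and resolution.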
Lumotive offered the following diagram, which helps visualize the systems, although obviously “suitability” and “too short” and “too slow” are somewhat subjective:
The main problem with the latter two is that they rely on a mechanical platform to actually move the laser emitter or mirror from place to place. It works fine for the most part, but there are inherent limitations. For example, it’s difficult to stop, slow or reverse a beam that’s being moved by a high-speed mechanism. If your 2D lidar system sweeps over something that could be worth further inspection, it has to go through the rest of its motions before coming back to it… over and over.
This is the primary advantage offered by a metamaterial system over existing ones: electronic beam steering. In Echodyne’s case the radar could quickly sweep over its whole range like normal, and upon detecting an object could immediately switch over and focus 90 percent of its cycles tracking it in higher spatial and temporal resolution. The same thing is now possible with lidar.
Every millisecond counts, because the earlier a self-driving system knows the situation, the more options it has to accommodate it. Imagine a deer jumping out around a blind curve. All other things being equal, an electronically steered lidar system would detect the deer at the same time as the mechanically steered ones, or perhaps a bit sooner; but upon noticing the movement, it could not just make more time for evaluating it on the next “pass,” but a microsecond later be backing up the beam and specifically targeting just the deer with the majority of its resolution.
(The beam isn’t some big red thing that comes out; that’s just for illustration.) The beam can still dedicate a portion of its cycles to watching the road, requiring no complicated mechanical hijinks to do so.
Targeted illumination would also improve the estimation of direction and speed, further improving the driving system’s knowledge and options. Meanwhile, the system has an enormous aperture, allowing high sensitivity.
In terms of specs, it depends on many things, but if the beam is just sweeping normally across its 120×25 degree field of view, the standard unit will have about a 20Hz frame rate, with a 1000×256 resolution. That’s comparable to competitors, but keep in mind that the advantage is in the ability to change that field of view and frame rate on the fly. In the example of the deer, it may maintain a 20Hz refresh for the scene at large but concentrate more beam time on a 5×5 degree area, giving it a much faster rate.
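Those quoted figures imply a fixed measurement budget that electronic steering can reallocate. As a rough, idealized sketch (uniform angular sampling assumed; these are not Lumotive's numbers, just arithmetic on the quoted spec):

```python
FULL_FOV_DEG2 = 120 * 25   # full field of view, in square degrees
ROI_DEG2 = 5 * 5           # deer-sized region of interest, square degrees
FULL_RATE_HZ = 20          # quoted frame rate for the full scene

# With the same per-second measurement budget and the same angular
# resolution, a region covering a fraction of the field can be revisited
# proportionally more often.
roi_rate_hz = FULL_RATE_HZ * FULL_FOV_DEG2 / ROI_DEG2

print(f"5x5 degree ROI could refresh at ~{roi_rate_hz:.0f} Hz")
```

In practice the system would split its budget, keeping the 20Hz scene-wide refresh while spending the remainder on the region of interest, so the real ROI rate would land somewhere below this idealized ceiling; the point is that the trade is made in software, on the fly, with no mechanism to re-aim.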
Meta doesn’t mean mega-expensive
Naturally one would assume that such a system would be considerably more expensive than existing ones, but this is far from exotic tech. Pricing is still a ways out — Lumotive just wanted to show that its tech exists for now.
The team told me in an interview that their engineering process was tricky specifically because they designed it for fabrication using existing methods. It’s silicon-based, meaning it can use cheap and ubiquitous 905nm lasers rather than the rarer 1550nm, and its fabrication isn’t much more complex than making an ordinary display panel.
CTO and co-founder Gleb Akselrod explained: “Essentially it’s a reflective semiconductor chip, and on the surface we fabricate these tiny antennas to manipulate the light. It’s made using a standard semiconductor process, then we add liquid crystal, then the coating. It’s a lot like an LCD.”

An additional bonus of the metamaterial basis is that it works the same regardless of the size or shape of the chip. While an inch-wide rectangular chip is best for automotive purposes, Akselrod said, they could just as easily make one a quarter the size for robots that don’t need the wider field of view, or a larger or custom-shape one for a specialty vehicle or aircraft.
The details, as I said, are still being worked out. Lumotive has been working on this for years and decided it was time to just get the basic information out there. “We spend an inordinate amount of time explaining the technology to investors,” noted CEO and co-founder Bill Colleran. He, it should be noted, is a veteran innovator in this field, having headed Impinj most recently, and before that was at Broadcom, but is perhaps best known for being CEO of Innovent when it created the first CMOS Bluetooth chip.
Right now the company is seeking investment, after running on a 2017 seed round funded by Bill Gates and IV, which (as with other metamaterial-based startups it has spun out) is granting Lumotive an exclusive license to the tech. There are partnerships and other things in the offing, but the company wasn’t ready to talk about them; the product is currently in prototype but very showable form for the inevitable meetings with automotive and tech firms.
