Talking Autonomy

Hardware to Enable Scalable Autonomy

See the lean sensor and compute stack that enables Ghost-equipped vehicles to drive autonomously.

By Ghost

August 3, 2023


Matt Kixmoeller:

Welcome to Talking Autonomy. In this episode, we're diving into the world of hardware, and specifically how we design hardware to enable all the software-defined functions that deliver autonomy here at Ghost. I'm joined today by Arpita. Welcome. Can you introduce yourself, tell folks what you do here at Ghost and maybe a little bit of what you did in your career before joining us.


Arpita Ghosh Dastidar:

Sure, at Ghost I work on hardware-software integration. I spend most of my time designing hardware and the rest of my time gathering software requirements to inform that hardware design. Before this, I was designing drones at Amazon for their autonomous delivery service, and before that I was making x86 servers.


Matt Kixmoeller:

All right, lots of cool, diverse projects. So you also brought some toys for us to play with, which I'm excited about. But before we check out what's on the table, maybe just give us a real high level view. How do we think about the hardware philosophy here at Ghost?


Arpita Ghosh Dastidar:

At Ghost we take an approach that is basically software-defined hardware. What that means is we design hardware to run software functions, and that hardware is designed for scale to fit in any car, whether at the luxury end or the more affordable end. This enables us to run different models and software on any car.


Matt Kixmoeller:

Yeah, I know the team is super demanding when they ask for a sensor stack that can fit economically in a $30,000 car as well as a $200,000 SUV.


Arpita Ghosh Dastidar:

Yes, and we have to meet all of those requirements.


Matt Kixmoeller:

That really changes everything in the company. When you start with a spec that says ‘this has to go in every car,’ it begets a certain hardware recipe, and that hardware recipe is co-designed with the software, because you need an approach to the software that can work on this level of hardware. Co-designing these elements is just inherent in everything we do.


Arpita Ghosh Dastidar:

I think the key there is to take a more generalized approach to hardware, an open design, versus a very specific, single-design-goal approach. Even though the second one offers more potential efficiency, I think the generalized approach is what has worked for Ghost and has made us very versatile.


Matt Kixmoeller:

I feel like this is a bit of a trend in the auto industry, where the old approach was to have tons of ECUs all around the car, with a specific processor (sometimes an FPGA, sometimes a dedicated ASIC) for each different function. What you're now seeing is a centralization and a realization that if we use central compute and do things in software, we can be much more flexible and adapt quickly.


Arpita Ghosh Dastidar:

Exactly, and for that we have to give up this notion of being exclusive and optimizing for one certain goal when we design hardware. We have to be applicable to all the different technologies, so our hardware can adapt to whatever our amazing software team comes up with.


Matt Kixmoeller:

I imagine that occasionally the software team comes up with a new approach to something and they just assume it will work on the hardware.


Arpita Ghosh Dastidar:

We just have to provide enough compute for them to do that and make sure we have enough flexibility on our boards.


Matt Kixmoeller:

Makes sense, let's dive into looking at the stuff you brought here on the table.


Arpita Ghosh Dastidar:

Absolutely.


Matt Kixmoeller:

Another one of the unique approaches at Ghost is that we build hardware in a way that it could go into a real production car.


Arpita Ghosh Dastidar:

Right.


Matt Kixmoeller:

Oftentimes when you look at autonomous projects, you see all kinds of sensors strapped to the outside of the car, a trunk full of boxy computers, extra power, all of that. In Ghost’s case, what you see on the table here is basically what we drive with.


Arpita Ghosh Dastidar:

And all of this costs under $1,500 and uses less than 150 watts of power. I think we have successfully accomplished that goal, and we are pretty proud of it.


Matt Kixmoeller:

Those are incredible stats. If you look at the gear that might go in a robo-taxi, it's probably not uncommon to see $30,000 or $50,000 worth of compute and sensors. We're doing this with under $1,500 of hardware and 150 watts. And in our cars today, we just run off the standard alternator, right?


Arpita Ghosh Dastidar:

Yeah, right.


Matt Kixmoeller:

All right, so first of all, this is the compute unit?


Arpita Ghosh Dastidar:

Correct, this is what we are driving off in our cars right now. What you see is this board here; that's inside. We also have a battery system inside to help us in situations when we are not drawing power from the car.


Matt Kixmoeller:

Not to date myself, but this kind of looks like a CD changer that might have gone in my car in the eighties.


Arpita Ghosh Dastidar:

Absolutely, absolutely. And this is cooled by just generic fans as well. And it's under 100 watts.


Matt Kixmoeller:

Under 100 watts for the entire compute complex! And it looks like there are five processors on the board?


Arpita Ghosh Dastidar:

Yes, there are five processors, and there's flexibility for CAN transceivers so we can connect to the car. Here's a little microprocessor that does all the power monitoring and sequencing for the board. But yeah, we are basically running the car off these five SoCs.


Matt Kixmoeller:

The processor is a Qualcomm 845. I mean this is literally a six or seven year old cell phone processor, right? It's smaller than the size of my thumbnail here!


Arpita Ghosh Dastidar:

And they are about seven to eight watts each. This board also has the flexibility to connect to cameras, and you have the cameras here. This is our 12-megapixel camera, taken from the mobile phone industry. This is just a flex MIPI connection coming up from the camera, and the way we've designed it, you can plug it into any compute; it's not dependent on what you have on the other end.
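As a rough back-of-the-envelope check on the power budget described here, the sketch below totals five SoCs at roughly 7 to 8 watts each plus an allowance for everything else on the board; the overhead figure is an illustrative assumption, not a measured Ghost number.

```python
# Back-of-the-envelope power budget for the compute unit.
# Per-SoC draw comes from the conversation above; the overhead is an assumption.
NUM_SOCS = 5
WATTS_PER_SOC = 7.5      # "about seven to eight watts each"
OVERHEAD_WATTS = 25.0    # assumed: fans, cameras, CAN transceivers, power sequencing

total_watts = NUM_SOCS * WATTS_PER_SOC + OVERHEAD_WATTS
print(f"Estimated compute-unit draw: ~{total_watts:.0f} W")  # comfortably under the ~100 W quoted
```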


Matt Kixmoeller:

So this has a bit of a longer lens on it because we're looking far but that's essentially equivalent to the camera I might find in my cell phone, right?


Arpita Ghosh Dastidar:

Absolutely.


Matt Kixmoeller:

Since we're talking cameras, why don't you explain what this is.


Arpita Ghosh Dastidar:

This is our next-gen camera, an automotive-grade camera. An automotive-grade camera has to withstand the higher temperatures of the automotive environment, and not only that, this one also has all the electronics built into it. All the serializing is done inside, so you just have one coax cable connected to your compute, and we have eight cameras on a car.


Matt Kixmoeller:

So we have four camera pairs, stereo in every direction. All right, and then this is our front window module?


Arpita Ghosh Dastidar:

Yes, this is our front camera assembly, which has a pair of the first cameras that I showed you. It also has this interior camera for monitoring the driver and understanding their intent.


Matt Kixmoeller:

It goes like this on the window; you can see the two cameras coming out.


Arpita Ghosh Dastidar:

Yes, it is designed to take the profile of the windshield.


Matt Kixmoeller:

So this gives us, what, 17 centimeters of separation for stereo disparity between the two cameras?


Arpita Ghosh Dastidar:

Yes, 17.1 cm.
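For a sense of what a 17.1 cm baseline buys you, here is the standard pinhole-stereo depth calculation; the focal length below is an illustrative assumption for a long-lens 12 MP camera, not Ghost's actual optics.

```python
# Classic stereo geometry: depth Z = f * B / d,
# where f is focal length in pixels, B the baseline in meters, d the disparity in pixels.
BASELINE_M = 0.171      # the 17.1 cm separation mentioned above
FOCAL_PX = 4000.0       # assumed focal length in pixels (illustrative, not Ghost's spec)

def depth_from_disparity(disparity_px: float) -> float:
    """Estimated depth in meters for a given pixel disparity."""
    return FOCAL_PX * BASELINE_M / disparity_px

# Example: an object that shifts 10 pixels between the two cameras
# would sit roughly 68 meters away under these assumptions.
print(f"{depth_from_disparity(10.0):.1f} m")
```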


Matt Kixmoeller:

Thanks for showing us all the hardware. Let's loop back around and talk a little bit more about the whole software-defined design. Now, in the past, as we’ve said, the common approach was lots of smaller processors spread around the car. If you look, for example, at a camera or a radar, you would typically have a chip right next to it to process its raw data, and then just small amounts of highly processed data were sent to the centralized driving ECU. So we've turned that wholly upside down, right? How is the new design architected?


Arpita Ghosh Dastidar:

I think the key decision behind that is to offer flexibility. When we capture raw data and send it to the computer, our neural net can process it however it wants and run different kinds of models; that gives us flexibility, and we are not tied down to whatever we get from the upstream sensors. The initial bottleneck, and that's a challenge in every industry, is compute versus power efficiency. Once you nail that down and can provide enough compute without using too much power, I think the best decision is to get all the raw inputs to the big compute system so it can run neural networks on the maximum amount of data. This way we have one master computer.
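A minimal sketch of the centralized data flow described here: every camera streams raw frames to one compute node, and a single model consumes all of them, so swapping models never touches per-sensor hardware. All function names and shapes are hypothetical placeholders, not Ghost's software.

```python
from typing import List
import numpy as np

NUM_CAMERAS = 8  # four stereo pairs, as discussed above

def capture_raw_frames() -> List[np.ndarray]:
    """Placeholder: grab one raw, unprocessed frame from each camera."""
    return [np.zeros((3000, 4000, 3), dtype=np.uint8) for _ in range(NUM_CAMERAS)]

def run_neural_net(frames: List[np.ndarray]) -> None:
    """Placeholder for the central model: it sees every raw frame at once."""
    ...

# Centralized loop: raw sensor data flows to one computer, where the model runs.
# A real system would loop continuously; a few iterations keep this sketch finite.
for _ in range(3):
    run_neural_net(capture_raw_frames())
```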


Matt Kixmoeller:

It does seem like this provides the ultimate flexibility. And of course, if we bake something into an ASIC that's going to be in a car for 10 years, it can never be changed. No one wants to run a neural network from a decade ago in their car.


Arpita Ghosh Dastidar:

Exactly. That also means these peripheral sensors are easy to service and are more affordable and scalable than having individual ASICs for each of them in their assemblies.


Matt Kixmoeller:

I think the other thing that I've heard a lot from the automakers over the last few years is supply chain. Some of them got into a situation where one little ASIC for one little function stopped an entire production line because they couldn't get ahold of it.


Arpita Ghosh Dastidar:

Right, and these have to work for 10 years. That's the expected car lifespan in automotive.


Matt Kixmoeller:

So the last piece is that we’ve built real production-ready hardware; it's something that could literally go in a car. But obviously, since we don't design the car, we can't do the perfect integration, where the camera might be in the B-pillar or the front camera might be integrated into the same assembly as the rear-view mirror. So how do we work with automakers to take this design into their vehicles?


Arpita Ghosh Dastidar:

Our approach is to provide them with blueprints. From the blueprint they can choose how to customize things according to their specific vehicles, changing form factors and packaging. But the basic component design concept comes from us.


Matt Kixmoeller:

So our goal is to provide a reference architecture, but they're the experts at large-scale manufacturing, integration, and automotive-grade testing across all the extreme use cases.


Arpita Ghosh Dastidar:

Yes, and a lot of automotive electronics also have to go through compliance testing against automotive standards, and automakers have been doing that for years. So they integrate Ghost with the electronics in their system and scale power and battery accordingly.


Matt Kixmoeller:

All right, Arpita, thanks so much for giving us this small window into the Ghost hardware. And this is just our first-generation hardware; we're actually in the transition right now to a whole new hardware architecture that is coming out. You can see the new camera there, but there's a lot more coming, so we’ll look forward to talking with you again about that shortly, when it's available.


Arpita Ghosh Dastidar:

Thank you Matt, thank you for having me here.
