Meet Chad Partridge: 375ai’s Chief AI Officer and Resident Chad

As part of an ongoing series shining a light on our team, I had the pleasure of sitting down with 375ai’s Chief AI Officer, Chad Partridge. We spoke about his trajectory from aerospace to transportation to edge artificial intelligence (AI), what makes 375ai unique, and more. As an entrepreneur with multiple successful acquisitions and more than 20 years of experience building at the intersection of computer vision, edge AI, sensors, and autonomy, Chad brings invaluable depth to an already deep team at 375ai. I hope you enjoy the conversation.
A lifelong quest to make sense of visual data
Michael Gushansky: Hey, Chad! Excited to have you as the first feature in our team profile series. Let’s start with your career path. You’ve had a really interesting journey from drone/robotics/autonomy research to now leading AI at 375ai. How did you get here?
Chad Partridge: Sure, Michael. At Stanford I was part of the Aerospace Robotics Lab. There, I mainly focused on Mesicopters (mini-drones) and computer vision projects requiring real-time responsiveness. Before that, during my time at Illinois, I built an air hockey-playing robot powered by computer vision and nascent machine learning. I even worked on a juggling robot at Michigan. Those early experiences solidified my passion for developing systems that could make sense of visual data and interact with the world.
I was working with a group of guys at Stanford, and we decided to start a computer vision-based startup called Sensing Systems. We didn’t really know where everything would lead, but we had a repository of promising computer vision capabilities. We were encouraged to participate in the Small Business Innovation Research (SBIR) program, a government initiative supporting technological innovation. We applied for several SBIR solicitations, and one unexpectedly hit: a project focused on video compression and low-latency video delivery for drones. We bundled our capabilities into a product called Tungsten, an SDK for drone ground stations. Major companies had their own ground stations, but they lacked the fast, efficient video processing capabilities we offered. This gained traction quickly, and we ended up selling into multiple programs.
While leading Sensing Systems, I also became deeply involved with the Association for Unmanned Vehicle Systems International (AUVSI), the world’s largest drone organization. I started as a Chapter President and eventually joined the international board, where I served for six years.
Sensing Systems had become a significant player in the drone industry, and in 2011 we were acquired by 2d3, a subsidiary of the UK-based Oxford Metrics Group. We became 2d3 Sensing, I joined the executive team, and it was a great success. We expanded from selling software toolkits to offering comprehensive video management and exploitation solutions with incredible 3D modeling and georegistration capabilities, which I’m very proud to have been involved with.
Ultimately, the company was acquired by Boeing, marking another successful exit. At that point I hit an inflection point: I wanted to lead something a bit more commercial, so I pivoted. Around this time, the driverless car space was heating up, and I co-founded Metamoto, a virtual simulation platform for autonomous vehicles. The idea was to create a massively scalable testing environment to accelerate innovation in autonomous technology. While the concept was solid, the market was perpetually emerging and investment became siloed around a few major players. Despite having over 30 customers and many successes, the natural path for Metamoto was acquisition. We pursued active acquisition interest, which was going great until COVID hit. That time period was tricky, to say the least. We eventually sold the company to Foretellix in 2020.
After Metamoto, I took time to reflect on my career and lessons learned. During this period, I served as an entrepreneur-in-residence at Titletown Tech, the Green Bay Packers’ venture organization, and as a Future Founder at Menlo Ventures. Ultimately, my conclusion was that what I really wanted to do was create an edge AI solutions company, very akin to what I had done early on with drone video. While I was doing that, Trevor Branon [375ai COO] got wind of what I was doing and had done, and said, “Hey, I’m working on this thing, and I think you could be a perfect fit to work with us.”
My plan was to do this as a contract engagement, but I got to know Trev, Rob, and Harry, their backstory, and what they had achieved separately and together, for instance at Linksys. I was really impressed by their plans to turn infrastructure like billboards and rooftops into edge processing and analytics centers, and I decided to stay with the team long term. 375ai essentially provides the ultimate computer vision and sensor-centric edge AI suite. It pretty much consolidates my entire past into one sector, and it’s perfect for me.
Why 375ai?
Michael Gushansky: I know you met Trevor playing tennis, which is another passion of yours, and you were already working on your own edge AI solutions, but what drew you specifically to 375ai?
Chad Partridge: One of the new frontiers in AI is multi-modal AI, and I was looking for rich applications in it. You can say, hey, I’m an edge AI specialist and I can do all these things, but if you don’t have access to relevant data, you can’t do much. When I was in drone land, I had access to all types of drone feeds. When I was in driverless car land, I was able to create virtual environments to simulate data. Now I’m in this new world, and if I’m going to do edge AI, I need rich data sources with a distinct business purpose or I’m limited. What really makes 375ai unique is that it has exactly the type of partnerships and multi-modal data sources that I was interested in working with. The second thing is that I really wanted to use powerful NVIDIA Jetson and DeepStream capabilities and find unparalleled applications for them. 375ai is ideal for that. The leadership team’s ambition and their background, including what they achieved at Linksys, sealed the deal.
Michael Gushansky: We talk a lot about multi-modal data. Can you explain what that is for those unfamiliar with the term? Why is it important?
Chad Partridge: Absolutely. Multi-modal data involves collecting diverse types of information—like video, audio, location, and environmental metrics—and integrating them to provide richer context. For example, at 375ai, we’re collecting video feeds, audio, and other sensor data to create a holistic understanding of environments. This kind of data is essential for real-world applications, from transportation analytics to urban planning, because it gives us a comprehensive view that single data streams can’t provide.
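
To make the idea of fusion a little more concrete, here is a minimal Python sketch of combining readings from different sensors into one joint context record. Everything here (the `EdgeReading` type, `fuse_window`, the payload fields) is a hypothetical illustration of multi-modal fusion in general, not 375ai’s actual data model:

```python
from dataclasses import dataclass

@dataclass
class EdgeReading:
    modality: str      # e.g. "video", "audio", "environment"
    timestamp: float   # seconds since some shared clock
    payload: dict      # modality-specific measurements

def fuse_window(readings: list[EdgeReading], start: float, end: float) -> dict:
    """Group readings from different sensors that fall in the same time
    window, so downstream analysis sees one joint context record."""
    window = [r for r in readings if start <= r.timestamp < end]
    return {
        "window": [start, end],
        "modalities": {r.modality: r.payload for r in window},
    }

# A truck seen on camera, engine noise heard, temperature logged: fused,
# they describe one moment far better than any single stream alone.
readings = [
    EdgeReading("video", 12.1, {"objects": ["truck"], "count": 1}),
    EdgeReading("audio", 12.3, {"db": 74, "class": "engine"}),
    EdgeReading("environment", 12.0, {"temp_c": 21.5}),
]
print(fuse_window(readings, 12.0, 13.0))
```

The point is the joint record: each stream on its own is ambiguous, but together they pin down what actually happened at that moment.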
Next generation edge AI infrastructure
Michael Gushansky: What’s unique about the technology behind 375ai’s edge devices?
Chad Partridge: Our devices are completely cellular-connected. You don’t have to run fiber to these sites or anything like that, which makes them very portable and scalable. They’re equipped with cameras, audio sensors, and more. What sets our edge devices apart is their edge computing capability. Instead of sending raw data to a central server for processing, which is inefficient and often expensive, we process data right on the device. This approach reduces bandwidth usage and ensures we’re only transmitting high-value insights, not terabytes of raw video.
On top of the base capability of the device, what makes it really special is that it’s a platform. We have a ton of partners coming to us looking to layer all types of Decentralized Physical Infrastructure Network (DePIN) apps and sensors on top of it. For instance, we have gotten requests from drone companies that want to use our platform and locations as ADS-B transponder sites for locating drones. We are pairing capabilities like GEODNET and Helium to extend our insights and reach. Ultimately, to me, edge processing in itself is entirely cool, but the fact that the edge device is also an expandable platform for other real-world DePIN applications is equally impressive.
Michael Gushansky: So essentially, the paradigm has been that if you put video cameras and sensors at the edge, all of that raw video and sensor data has to be compressed and sent to some data center for processing, and bandwidth-wise that is prohibitive because it’s expensive and doesn’t scale. The innovation here is that instead of shipping all of this data off to a data center, we use the compute and AI built into the device to process the data at the edge, where it’s collected, and provide real-time insights.

Chad Partridge: That’s right, and it goes back to my drone days too. They would have all these drones streaming video and storing it all, and a lot of it was useless. Digging for information in those archives is like finding a needle in a haystack. You’re looking at hours of video, but what you really care about is the very small instant in time when something interesting happened, and that’s what we’re capturing. Our cellular pipe is pretty thick, but we’re continuously trying to reduce how much of it we actually need, using it just for the essential events.
We’re not collecting any personally identifiable information; instead, we create generalizations of these types of data and distill them down into, for instance, JSON files that we send to our back end. That’s really low-bandwidth, high-value data. Pretty cool.
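
As a rough sketch of that distillation pattern, here is what the loop might look like in Python. The function names, frame format, and uplink are hypothetical stand-ins, not 375ai’s actual pipeline:

```python
import json

def detect_objects(frame: dict) -> list[str]:
    # Stand-in for an on-device model (e.g. one running via DeepStream);
    # here we simply read labels off pre-annotated dummy frames.
    return frame.get("labels", [])

def send_to_backend(payload: str) -> None:
    # Stand-in for the cellular uplink; a real device would transmit this.
    print(f"uplink ({len(payload)} bytes): {payload}")

def process_stream(frames: list[dict]) -> None:
    # Emit a compact JSON event only when something noteworthy appears,
    # so uplink traffic scales with events, not with hours of raw footage.
    for frame in frames:
        labels = detect_objects(frame)
        if labels:  # most frames are uninteresting and never leave the device
            event = {"ts": frame["ts"], "detections": labels}  # generalized, no PII
            send_to_backend(json.dumps(event))

# Three "frames" of video: only the middle one generates any uplink traffic.
process_stream([
    {"ts": 100.0, "labels": []},
    {"ts": 101.0, "labels": ["pedestrian", "bus"]},
    {"ts": 102.0, "labels": []},
])
```

A JSON event like this is a few dozen bytes, versus the megabits per second a raw video stream would consume, which is where the bandwidth savings come from.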
Michael Gushansky: We’ve talked about your trajectory and 375edge. What excites you most about working at 375ai?
Chad Partridge: The potential is immense. We’re at the forefront of edge AI, creating solutions that have endless applications. Whether it’s helping advertisers understand the value of their billboards, improving urban infrastructure, providing rich data sets for training, or unlocking new insights from multi-modal data, the opportunities are limitless. Plus, the team’s ambition and our partnerships, especially with Outfront Media, give us incredible reach and resources.
Coach Chad
Michael Gushansky: Sometimes it’s hard to believe, but we do have lives outside of work. What do you do for fun outside 375ai?
Chad Partridge: I’m the captain of my tennis team, though Trevor’s the better player! I also coach my kids’ sports teams and serve as the Treasurer of our local youth baseball association. I’ve coached more than 30 teams so far. Between tennis, coaching, and family time, I stay pretty active.
Michael Gushansky: Last question: What’s a fun fact about you?
Chad Partridge: I’ve escaped from Alcatraz… twice! Not literally—I’m talking about the Escape from Alcatraz triathlon. I used to do a lot of triathlons and marathons, so that’s a fun part of my past. Now I just barely keep up with my wife at our local Basecamp Fitness.
Michael Gushansky: Any parting thoughts?
Chad Partridge: I just think that with a team like this, with very ambitious leadership, we’re already introducing three different product lines and talking about expanding. The Outfront partnership gives us enormous reach, but we’re not stopping there. I have a feeling this is a business that can become edge AI everywhere, and that’s very, very powerful. There’s no end to the applications we can build on top of all of this. And I think we have the type of team that is willing and able to take the plunge on all of them.