
The Age of Physical AI: Inside Oshkosh’s Blueprint for an Autonomous Future

Executive Summary

Jay Iyengar, EVP, Chief Technology and Strategic Sourcing Officer at Oshkosh, joined Grayson Brulte on The Road to Autonomy podcast to discuss the company’s approach to Physical AI and industrial automation. By focusing on “moments of autonomy,” Oshkosh builds purpose-built solutions that increase safety and productivity for the everyday heroes who build, serve, and protect communities. The conversation explores real-world applications ranging from automated airport jet bridges to the HARR-E refuse collection robot.

Key Episode Questions Answered

What does Oshkosh mean by “moments of autonomy”?

Oshkosh defines “moments of autonomy” as applying autonomous technology specifically where it adds the most value (like repetitive or hazardous tasks), rather than attempting to fully replace a human being in all situations.

How is Oshkosh applying Physical AI to airports?

Oshkosh is deploying automated technology to handle jet bridge docking, utilizing systems that identify the aircraft door and maneuver the bridge to within inches of the plane, which reduces productivity loss and lowers the risk of damage.

What is the HARR-E robot?

HARR-E is an autonomous robot created by Oshkosh for on-demand refuse collection in planned communities, university campuses, and stadiums, serving as a disruptive alternative to traditional, weekly garbage truck routes.

The Road to Autonomy Topics & Timestamps

[00:00] Moments of Autonomy Philosophy

Jay Iyengar explains Oshkosh’s mission to build technology for “everyday heroes” and introduces their core strategy of targeting specific “moments of autonomy” to assist workers, rather than trying to fully replace human operators.

[04:45] The Jet Bridge Bottleneck

A look at the hidden inefficiencies of air travel, detailing the complex training, weather delays, and significant productivity losses associated with manually docking jet bridges.

[07:20] Deploying Physical AI at the Gate

Oshkosh is solving gate delays by deploying physical AI that automatically identifies the aircraft door and maneuvers the bridge to within inches, leaving only a minor final adjustment for the operator.

[10:45] Navigating Tarmac Chaos and Regulations

A realistic discussion on the hurdles of tarmac autonomy, including strict aviation regulations, the dynamic nature of airports, and why Oshkosh focuses on low-hanging fruit to add immediate value.

[14:15] Blueprint for the Airport of the Future

Jay outlines a broader vision for airfield automation, including autonomous cargo loaders, ground support equipment, automated chocking bots, and perimeter security robots.

[16:05] The Data Moat & Oshkosh’s AI Stack

An inside look at Oshkosh’s technology stack, emphasizing why the integrity of proprietary training data and strict cybersecurity measures are their biggest competitive advantages.

[19:30] Weighing Trash with AI Side-Loaders

How physical AI is transforming waste management by allowing refuse trucks to automatically locate curbside bins, deploy the arm, and instantly weigh the contents.

[21:30] Meet HARR-E: The On-Demand Trash Robot

Introducing HARR-E, a robot that replaces loud, weekly garbage trucks with quiet, on-demand refuse collection for planned communities and university campuses.

[26:30] Revolutionizing the Postal Delivery Fleet

Exploring the future of logistics, including concepts where next-generation delivery vehicles act as mobile depots that deploy smaller autonomous robots into neighborhoods.

[28:15] Why You Shouldn’t Over-Engineer Sensors

Jay breaks down Oshkosh’s hardware approach, explaining why it’s critical to match specific sensor and compute suites to harsh industrial environments (like construction dust) without over-engineering the solution.

[30:30] The Hidden Power of Strategic Sourcing

How combining the CTO role with strategic sourcing gives Oshkosh a massive advantage in securing vital tech partnerships, managing supply chains, and controlling the costs of GPUs and sensors.

[32:20] Level 5 Military Learnings

The technical breakthroughs and rigorous validation strategies Oshkosh has transferred from its highly classified defense leader-follower autonomous programs into the commercial sector.

[35:10] Waiting for Physical AI’s ChatGPT Moment

Jay agrees that physical AI is approaching a massive tipping point, predicting a future where industrial vehicles are so intuitive that operators won’t even realize they are using advanced autonomy.

[36:30] The Next 100 Years of Oshkosh

A final reflection on Oshkosh’s legacy and its ongoing mission to keep the world moving forward by transforming the equipment used by those who build, protect, and serve communities.

Full Episode Transcript

Grayson Brulte: Jay, it’s great to have you here on The Road to Autonomy. Oshkosh is world renowned for your design and your engineering prowess. The world’s changing, it’s going automated, and everything’s becoming autonomous. How is Oshkosh as a company currently thinking about autonomy?

Jay Iyengar: Thank you for having me. Really excited to talk to you about the impact Oshkosh has overall. Maybe a quick introduction to Oshkosh, then I’ll talk specifically about the autonomy question. We are an industrial technology company. We provide products for the everyday heroes, those who have the tough jobs in our neighborhoods, those who build, serve, and protect communities: construction workers working at height, the refuse collection teams that come and maintain your neighborhoods, firefighters, soldiers, and the people who work on the tarmac at an airport under all kinds of weather conditions in a dynamic environment. Those are the heroes we talk about, and the world doesn’t move without them. So we talk about technology for the everyday heroes, and as an industrial tech company we always look at the end user and make sure our products are serving their needs and making them safer and more productive. On the autonomy-specific question, we don’t think of autonomy as something to replace a human being. That is really not the main vision of it. We talk about moments of autonomy, which means applying autonomous technology where it adds the most value, rather than pursuing autonomy under all situations. Our vehicles are complex. They are purpose-built, and they’re all there to do a job. They’re not there for someone to go from place A to place B. They’re people’s offices, and there are a lot of complex controls involved as you get into these vehicles and think of a day in the life of one of our heroes.
There are quite often repetitive tasks. Many times the work is hazardous; it’s not the safest place to be. It’s repetitive, there’s a lot of driver fatigue, and there are precision requirements in what they do. In all of those cases, if you apply autonomy in a targeted way, there is a tremendous amount of value to be had. We are B2B, so our direct customers get total-cost-of-ownership benefits, and clearly the end user has benefits in terms of safety and productivity. It just makes the entire value chain extremely efficient and safe. So we talk about moments of autonomy all the time. I can give you very specific examples of what I mean by that, but that’s how we think of autonomous solutions. And this applies across all our products. I can’t think of one product where moments of autonomy doesn’t apply.

Grayson Brulte: When you think about the moments of autonomy — and I love the line “everyday heroes” — how much of it is trying to unlock productivity gains and improve safety as you’re developing, and even going into the lab to come up with these new ideas for products?

Jay Iyengar: It is the majority of it. We do a very comprehensive voice of the customer, and because of who we are, we are very close to the customer — meaning not just our direct customers but the operators of our vehicles — so we understand the pain points they face. Whether we spend time with our direct business customers or with the operators of our vehicles, we map the things that are inherently inefficient or unsafe about the work they do, particularly repetitive tasks, driver fatigue, injury, those types of things. We use that to prioritize where autonomy, or technology generally, can add benefit. And in so many ways it is possible to quantify it. It isn’t just a feeling; we can actually dollarize the benefits and dollarize the issues they have. One example I could give you is airports. When you land, first of all, many times you wait for a jet bridge operator to show up. And then — I don’t know if you’ve ever watched — it’s a very complex thing to move. You’re moving an entire building closer to millions of dollars of aircraft, so operators are very nervous to make sure they’re aligning properly. Within a gate, different sizes of aircraft come in, so the doors are in different locations to align to. And it takes a lot of time for operators to get trained; the training is actually quite tedious for them to really get good at. So you lose time in the process.
Many times you lose time because they take longer to dock, many times because they’re not there, and many times because there’s inclement weather outside and you can’t do it. In every one of those cases there’s a significant productivity loss, not to mention the impact on the passengers who fly. So these are the kinds of things we prioritize. We work very closely with our customers; it’s not just about what we do in our labs. What we do in our labs is the easier bit. It’s really about understanding and actually deploying the technology. And the last piece I would say is we work very closely with our customers to deploy technology early in the life cycle, so that we get feedback on the refinements that are required. As an example, we have our next-generation Volterra fire trucks in the hands of our customers, on daily routes. We monitor how they’re doing, and that’s where we further tune the product to make sure it really meets the actual requirements of the use cycle. So that’s how we think about it.

Grayson Brulte: Tuning’s top of mind. Everybody talks about tuning models, but you’re tuning — if you want to use the buzzword now, physical AI — you’re tuning it there. I’m curious about the airport bridge. We’ve all been stuck; I’ve been stuck, I think, an hour once, waiting for the individual to move the bridge. And let’s just be very honest here: people on the plane get a little feisty after waiting a long period of time after a long flight, because all they want to do is get off the plane. How close are we to automating the airport bridges?

Jay Iyengar: We are very close to automating it. The technology is already here. The technology actually identifies where the door is and then it moves. I talked about moments of autonomy: all it needs to do is get very, very close to the door — maybe the last five inches — moving the bridge and adjusting its height. The last bit of the movement can be done by a human being; it’s very easy to move the last few inches. So we are actively deploying jet dock technology, and there are always further enhancements you can do. The technology isn’t just the one thing; it’s the beginning of a platform. You can add more features and functions to it as it evolves. It is all physical AI. You saw a lot of AI at CES; this is physical AI at the point of impact, making an impact. Again, we don’t have to take humans out of these things. Let’s imagine the scenario where the jet dock can get you five inches from the door and a human being does the last bit. That means they don’t have to wait for a skilled, trained jet bridge person; they can be more flexible in how they deploy resources. And clearly there’s precision involved: the risk of damage to the aircraft is virtually minimized with this type of technology. So there are a lot of other benefits that happen. Time is money, and the sooner your passengers can get off, the better. We have very specific time targets for the jet dock operation, and that’s what we design to.
You can always do more — and now I’m taking you to the next level — such as cameras within your jet bridges, so you can see passengers coming on or off. Did the cleaning crew go in? Did the operations get done? Things are possible beyond just docking the bridge: conditions inside the bridge, air conditioning, temperatures. There’s a lot possible. As you start to build this platform, there are very compelling cases for what the technology can do to add value and make things more productive in the world.
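The approach-then-handoff pattern Jay describes can be sketched in a few lines. This is a hypothetical illustration, not Oshkosh’s control software: the five-inch handoff threshold comes from the conversation, while the step size, function name, and logging are invented for the example.

```python
# Sketch of a "moments of autonomy" docking loop: the system closes most
# of the distance to the aircraft door, then stops and hands the final
# inches to a human operator. Names and the step size are assumptions.

HANDOFF_DISTANCE_IN = 5.0   # hand control back to the operator at ~5 inches
STEP_IN = 2.0               # maximum bridge advance per control cycle

def dock_bridge(distance_to_door_in: float) -> list[str]:
    """Advance the jet bridge toward the detected door, then request handoff."""
    log = []
    d = distance_to_door_in
    while d > HANDOFF_DISTANCE_IN:
        advance = min(STEP_IN, d - HANDOFF_DISTANCE_IN)
        d -= advance
        log.append(f"auto: advanced {advance:.1f} in, {d:.1f} in remaining")
    log.append("handoff: operator completes final alignment")
    return log
```

The key design choice mirrored here is that the loop never drives the bridge past the handoff threshold, so the precision-critical contact is always human-supervised.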

Grayson Brulte: There are a lot of ways to add value, and I’m going to follow your line of thinking and take it one step further, as you identified the labor shortage because of the training and certifications that have to go into it. Could we get to a hypothetical scenario in the future where perhaps one of the stewardesses or the pilot could do the last couple of inches, if they have the ability to see the camera? Because that way you eliminate the “oh, we have to wait for the person; oh, that’s going to be 10, 15 minutes.” Could we eventually see it be done from inside the aircraft?

Jay Iyengar: Theoretically anything is possible; the technology would allow you to do that. So it’s a question of the requirements and regulations that come into it, and there are a lot of those. I’m very humbled by the environment on the tarmac. There is a lot that happens there. We just take it for granted that the plane lands and somehow we get off very quickly, but there’s a lot of variability and a very dynamic environment. Today there are technologies where controlling the tug vehicle — you know how the aircraft gets tugged in — happens from within the aircraft many times. So such things are possible, but there’s the complexity of where the aircraft is going to go, and we have to think through the whole value chain before deploying a technology just to solve one specific problem. We’ve got to think of the bigger picture. So the technology isn’t stopping what you’re asking. It’s a matter of what’s the practical, pragmatic way of doing this, and whether that’s where the significant value is. I believe the low-hanging fruit right now is making it a lot easier for the operator and not having to depend on the training. That’s the very first step. I see a lot of these types of low-hanging fruit the technology can address, and that’s what we’re going after at Oshkosh: get through some of these things in the next few years, and you can always keep adding more.

Grayson Brulte: And that’s the beauty of physical AI: you can keep adding on — or, to use the child’s term, a LEGO block. You can always put an additional LEGO block onto it. And you know this very well, and a lot of this audience knows it: when you’re dealing with airports and you go on the tarmac, there are different sorts of regulations that have to be abided by, different ways the processes have to be done, and the technology has to be developed accordingly. Is Oshkosh looking at further on-tarmac operations, or are you staying pretty close to, if you want to call it, the hub — the terminal where the passengers are? How are you thinking about the overall on-tarmac autonomy strategy?

Jay Iyengar: We are very much looking at the whole airport of the future, which would include moments of autonomy around the tarmac. The ground support equipment we provide today — whether it’s tug vehicles, tractors for cargo loaders, and so on — that’s where we also see an opportunity. I know we talked a lot about the jet bridge, but even beyond it there’s a lot of opportunity, because people aren’t allowed to go out freely. By the way, it takes three people to park an aircraft. You need the two wing walkers, and you need the person up front to guide it in. And then you need to chock the aircraft. If the aircraft isn’t chocked, nothing can happen. They can’t open the door; nothing can happen until the aircraft is chocked. Those operations — three people guiding the aircraft — can be very easily automated. We showed the vision of a robotics platform, based on something we’ve done in defense applications, that can work under harsh conditions. Obviously it needs to know where it is and what type of aircraft it is, and guide the aircraft appropriately. We also see cargo loaders. Cargo loading can be done by a human being, but the cargo tractors always move from place A to place B; they go back and forth between the aircraft and where it gets unloaded. There’s no reason why that cargo load going back and forth cannot be automated in that particular geofenced area. Cargo loading, we are automating.
Cargo loading, again, will get you as close as possible to the aircraft, and the last few inches can be done by a human being — the same idea as the jet dock. Chock bots: clearly chocking could be done as well. So we showed all of these things in our vision of the airport of the future. There’s a tremendous amount — perimeter robots for security checks around the airfield, looking for foreign objects, looking for dead animals, all of those types of things. Overall, security robots are extremely value-add. And it’s not just about robotics. When I use the word robot, it’s AI — physical AI. Cameras and systems actually see what’s going on, interpret what’s going on, and actually make decisions. It’s all connected solutions and AI-based. So there are a lot of opportunities. It’s very fertile ground for autonomy and creating value with autonomous solutions.
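The chocking rule Jay states — nothing can happen until the aircraft is chocked — is essentially a safety interlock in front of every ground operation. A minimal sketch, with all class and method names assumed for illustration rather than taken from any real system:

```python
# Illustrative gate interlock: every ground operation is denied until the
# aircraft is confirmed chocked. In practice the confirmation would come
# from a chock bot's sensors; here it is a simple flag.

class GateInterlock:
    def __init__(self):
        self.chocked = False  # no operations allowed until this is True

    def confirm_chocks(self):
        """Record that chocks are in place (e.g. reported by a chock bot)."""
        self.chocked = True

    def request(self, operation: str) -> str:
        """Allow an operation only once the aircraft is chocked."""
        if not self.chocked:
            return f"DENIED: {operation} blocked until aircraft is chocked"
        return f"OK: {operation} may proceed"
```

The point of the pattern is that the check lives in one place, so adding new ground operations cannot accidentally bypass the chocking requirement.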

Grayson Brulte: And then below that are the models that power the physical AI. As CTO at Oshkosh, how are you thinking about models? Are you developing models in house, or are you using off-the-shelf models and building your own proprietary layer on top? How are you thinking about all these different models? Because you have an incredibly wide variety of business: you have a defense business, you have your airport business, you have your civilian business. How are you thinking about that, especially as it relates to models?

Jay Iyengar: We do have a well-thought-through technology stack. As I talk about deploying autonomy, what we really pay attention to is how we train the models and the data sets used to train them. We are very, very particular about using the right data sets — proprietary data sets that we control — because these are proprietary applications. It’s not the models themselves; there are times we actually use models that are readily, commercially available. It’s the training data sets, and then the protection of the data and information. Call it product cybersecurity: do you save all the data, do you save all the insights? Generally, we make sure we are collecting what is required and no more, and saving what is required and no more. That’s where we pay a lot of attention. It’s really the integrity of the training and of the data sets. In my view that’s more important, and that’s where we provide the most safety in an AI system.
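The “collect what is required and no more” principle reduces, in code terms, to filtering telemetry against an explicit allow-list before anything is stored. A minimal sketch with hypothetical field names — none of these are Oshkosh’s actual data fields:

```python
# Data-minimization filter: only fields on an explicit allow-list survive
# to storage; everything else (location traces, operator identity, etc.)
# is dropped at the point of collection. Field names are illustrative.

ALLOWED_FIELDS = {"vehicle_id", "timestamp", "arm_cycle_count", "bin_weight_kg"}

def minimize(record: dict) -> dict:
    """Keep only approved fields; drop everything else before saving."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}
```

An allow-list (rather than a block-list) fails safe: any new sensor field is excluded by default until someone deliberately approves it.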

Grayson Brulte: You have to. I mean, you can make the argument — a lot of famed venture capitalists would make it — that everybody has data, but the proprietary data is the moat. And you’re right about that, because proprietary data can allow you to do things. Do you generate that data from your various test courses and test ranges, or do you put things out into the field? How are you generating that proprietary data to ensure that it stays with Oshkosh?

Jay Iyengar: We collect our own data as we go through our own comprehensive testing, and many times we work very closely with the customer. We anonymize the data, of course, as we go through it. There’s no substitute for looking and feeling right there. We initially collect the data we need through our own testing, both in our labs and on our test courses. And many times customers give us access to their facilities for us to do our own testing as well, so that’s another source of information. It depends on the product you’re referring to.

Grayson Brulte: Underpinning all of that is great partnerships and great relationships. When you have great partnerships and great relationships, you can do really good things, and you have the partnership with the Department of War. Are you deploying any of these models in air-gapped environments?

Jay Iyengar: Particularly in our defense applications — but I can’t talk about that. There are applications there that are confidential. But even in things like — if you come to Oshkosh sometime, we could show you — we are working on moments of autonomy in refuse collection. Think of your refuse side-loader truck that goes from house to house to house. The technology actually identifies that there is a refuse container at the curbside, identifies and locates where it is, and stops the vehicle — either the driver can stop it, or it can automatically help the driver stop the vehicle. Then it automatically deploys the arm with the push of a button and grabs your bin. While it’s unloading, it’s actually looking at the content inside, and it’s actually weighing it. The AI technology weighs it. We’ve got a closed-course setup where we test these types of things, but then we actually go ride with our customers and their fleet operators. Our engineers go out there with them for a day and collect the data as part of it as well.
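The side-loader sequence Jay walks through — locate the bin, stop, deploy the arm, weigh while dumping — can be summarized as a small pipeline. This is an illustrative sketch; the step names, weights, and function signature are assumptions, not the actual product logic:

```python
# One curbside pickup as a pipeline: no bin detected means keep driving;
# otherwise stop, run the arm cycle, and compute refuse mass from the
# gross and empty-bin weights measured during the dump.

def collection_cycle(bin_detected: bool, gross_kg: float, empty_bin_kg: float):
    """Run one curbside pickup and return the weighed refuse mass."""
    if not bin_detected:
        return None  # keep driving; no bin located at this address
    steps = ["stop_vehicle", "deploy_arm", "grip_bin", "lift_and_dump"]
    refuse_kg = gross_kg - empty_bin_kg  # weighed during the dump cycle
    return {"steps": steps, "refuse_kg": round(refuse_kg, 1)}
```

Weighing per pickup is what enables the per-household trend data discussed next in the conversation: each cycle yields an address-level mass measurement as a side effect of normal collection.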

Grayson Brulte: Wow. I’m just thinking about this — I’m going to put my consumerism hat on. The amount of trend data that can be uncovered, that’s a whole business unto itself. Oh, this house is buying more of this; this house is buying that; oh, they cut back on this. That’s fascinating market-intelligence data. So you have — let’s call it your waste management business. I know I’m oversimplifying, but the reason I want to highlight it is that at CES you introduced HARR‑E. HARR‑E is not a person. HARR‑E is a robot — who I think is cool — that will remove the rubbish from your house. Talk about HARR‑E, please.

Jay Iyengar: HARR‑E is, again, robotics and autonomy. It’s a concept of how refuse collection is really changing and evolving in a planned community. It’s not for everything: a planned community, potentially an airport, potentially a university campus, potentially stadiums, et cetera. Today the big trucks come every week, stop at every house, and collect things. Why not turn that on its head and, instead of every week, make it on demand? There are times you don’t need it to come every week, and there are times you probably need it twice a week. You don’t store bags in your bin waiting for the week. So there’s the on-demand part of it: just like hailing, you call it on your app and it’ll come to you. It can even come close to your driveway. It’ll open its bin lid and you drop your things in. It’s good for people who don’t want to carry large refuse bags — elderly people who can’t — it helps them as well. And then think about what happens in the community. In my community it’s Friday mornings — you don’t have the noise of vehicles coming through. The HARR‑E robot collects it, takes it to a central station, probably within or right outside the community, and then one of the big trucks can come and pick it up. It’s much more efficient in terms of usage, and the cost of operations is much more efficient as you think about it.
To your point, HARR‑E can also learn a lot about the behaviors of the consumers. This house does something every Thursday, or this house is three times a week. It can learn more and more about the operations, and as it’s going around, there’s a lot it can observe. It can look for certain things that could be hazardous — a cat in a tree — and call somebody, that kind of thing. So it’s going around looking for things: the whole visibility and perception of everything else that’s going on, with vision and perception systems. It has a lot of tangential benefits too. So that’s the concept of HARR‑E. It’s a disruptive innovation, the way we think about it.
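The on-demand model behind HARR‑E amounts to a request queue served by a robot that ends each route at the central collection station. A toy dispatcher sketch — every class, method, and address here is an assumption made for illustration:

```python
# On-demand refuse dispatch: residents hail the robot from an app,
# requests queue up, and each route services the queue in order before
# returning to the central station where a truck collects the load.

from collections import deque

class OnDemandDispatcher:
    def __init__(self, station: str = "central_station"):
        self.station = station
        self.queue = deque()  # hailed addresses, first-come first-served

    def hail(self, address: str):
        """A resident requests a pickup via the app."""
        self.queue.append(address)

    def run_route(self) -> list[str]:
        """Visit each hailed address in order, then return to the station."""
        route = []
        while self.queue:
            route.append(self.queue.popleft())
        route.append(self.station)
        return route
```

Contrast with the status quo in the conversation: a fixed weekly route visits every house regardless of need, whereas this queue only visits houses that asked.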

Grayson Brulte: There’s a great benefit there — I think about this as a parent on Christmas Day: instead of having rubbish pile up outside, you send six or seven HARR‑Es and away it goes. Then I also see a benefit for master-planned communities, because there are some communities where it’s been well documented: oh, it’s an eyesore, or, oh, it’s noise pollution. You eliminate all that, so you can make an argument that HARR‑E increases the quality of living.

Jay Iyengar: Completely agree. And particularly for those, as I said, who are physically not able to carry or push big bins — not everybody can. And think about bad weather they need to push a bin through. Taking everything to the curbside once a week seems like a bit of a dated idea that can be completely changed with technology. But the technology applies only in certain settings. Again, planned communities make sense, campuses make sense. It’s not for everything; one solution doesn’t fit all. But it’s a good way to think about it, and we’re seeing a lot of interest from a lot of people. They’re asking us, when do you go to production with this? When can we buy it? We’re getting a lot of those kinds of requests based off of CES too.

Grayson Brulte: So let’s just use the master-planned community, because it’s one of the most logical places to deploy HARR‑E. You have, let’s call it, a central rubbish station, which is beyond the perimeter. Then perhaps — I’m putting my Mr. Rogers hat on — you have your neighborhood, and you have the delivery station next door, and perhaps Oshkosh makes an autonomous delivery bot. So when the UPS or Amazon packages come in, it goes in, and, let’s just call it, Oshkosh Delivery stops at your house. Is that the next logical step where this technology could go?

Jay Iyengar: You know, we showed the concept. HARR‑E is a whole other application, but it isn’t just about that; think of the technology platforms that can be scaled up. We showed at CES — we are the leaders in delivery vehicles with the United States Postal Service, which is going to be a fairly large fleet, and a fairly large EV fleet as well. And it’s not just about the vehicle; as I mentioned, it’s purpose-designed for safety and visibility. It is really one of the most modern vehicles you’ll see out there. So we showed this vision of the next generation of delivery — not specific to USPS per se — of what delivery could look like. That’s where you have a combination of HARR‑E technology with a next-generation delivery vehicle. Again, you can have autonomous operations within the perimeter of the depot, where somebody can hail the vehicles, they come in, they get loaded, and somebody can still drive them out. And then once it gets close to a neighborhood, they can stop and have something like a HARR‑E deploy and go drop the packages off. Those are the kinds of ideas we shared; it was a concept. We unveiled those concepts at CES this year. That’s the next iteration of what is possible, particularly in the delivery space. And maybe that’s my message: it is not one or the other. It is always a combination of technology, practically grounded in what the application is and what the situation is.

Grayson Brulte: Across different situations you have different environments and, as you’ve clearly articulated during this podcast, different use cases. Which raises the question of sensors and compute. How do you think about sensors and compute across these different use cases? Take HARR‑E: you probably need fewer sensors and less compute than one of your very large industrial autonomous applications, where you’re going to need a lot more of both. How do you think about that? Are you open to exploring new ideas based on new technologies, the use case, the environment, and the overall size of the vehicle?

Jay Iyengar: Absolutely. You have to be, and we are. By the way, all these vehicles have a lot of sensors and a lot of controllers. When we talk about intelligence, there is a ton of sensing and control; sensing, perception, controls, and actions become the key parts of it. Within the technology stack that drives those actions, I use the term “make-buy.” For the sensing platforms, whether cameras, radars, or lidars, we try to determine where we can commonize and where we can generate scale. It all starts with a software-defined vehicle architecture that can accept these sensing platforms. We will never make sensors; that’s not our thing. But we want to make sure the requirements are understood and harmonized so that we get some scale. And when it comes to the compute part of it, we partner with every major compute company you can think of, making sure that is part of the platform for that application. Then we focus on controlling the application side: implementation, verification, validation, system design, system integration. That’s what we focus on, whereas we have partners for all of these other areas. Many times people say, why don’t you just take it from here and put it on this? It’s over-engineered in so many cases, which you don’t want to do. I get that all the time. Our applications are different, and you have to take the environment into account.
Take dust, for example, the dust a construction vehicle is going to live in. You can’t just throw a camera on there and say it’s going to be okay. You have to think through which technology stacks make sense for the problem you’re trying to solve. That’s where the other part of my role comes in: I’m the CTO, and I’m also the strategic sourcing officer for the company. Obviously we source all day long; we buy lots of things, and we have a very strong supply chain organization. But the strategic part of sourcing is about thinking ahead and creating the right level of engagement with our tech partners early in the life cycle. In many cases the vehicle has to adopt the technology that’s available, whether it’s a camera or a radar, and it all has to come together in the right way. So strategic sourcing, having the right partnerships, is essential as we think through how to get scale, control cost, and get the right technology for the right product. That’s where sourcing and technology development come together.
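The sensing, perception, controls, and actions pipeline Jay describes can be sketched in miniature. This is a toy illustration only, not Oshkosh’s actual stack: the `Sensor` interface, the averaging fusion, and the stop-distance threshold are all hypothetical, but they show how a software-defined architecture can accept different sensor sets behind one perceive/control loop.

```python
# Toy sketch of a sense -> perceive -> control loop behind a common
# sensor interface. All names and values here are illustrative assumptions.
from dataclasses import dataclass
from typing import Protocol


class Sensor(Protocol):
    """Any range-producing sensor (camera, radar, lidar) plugs in here."""
    def read(self) -> float: ...


@dataclass
class Radar:
    distance_m: float
    def read(self) -> float:
        return self.distance_m


@dataclass
class Camera:
    distance_m: float
    def read(self) -> float:
        return self.distance_m


def perceive(sensors: list[Sensor]) -> float:
    """Fuse redundant range estimates by simple averaging (toy fusion)."""
    readings = [s.read() for s in sensors]
    return sum(readings) / len(readings)


def control(distance_m: float, stop_distance_m: float = 2.0) -> str:
    """Map fused perception to an action: stop inside the safety envelope."""
    return "stop" if distance_m <= stop_distance_m else "proceed"


# One tick of the loop: the same perceive/control code works whether a
# vehicle carries two sensors or ten, which is the point of commonizing.
action = control(perceive([Radar(1.8), Camera(2.2)]))
```

The design choice the sketch illustrates is the one Jay names: the vehicle architecture owns the loop and the interfaces, while the sensors themselves are bought, not made.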

Grayson Brulte: You need harmony there, because you’re right: if you’re going to source GPUs, you have to be able to source them, and at market price, not an inflated price. There’s a big difference. And if a new architecture comes along, if ARM comes out with a new architecture, or TPUs can start powering this, or some other processing unit comes out, you have to be able to adapt to that. That’s what’s really great: you have insight into the technology and insight into the sourcing. I want to go back in history for a quick moment. Oshkosh is very famous for the military’s leader-follower program, which has now been declassified. Did you learn a lot from that program from a technical standpoint? Have any of the technical breakthroughs from that program gone on to evolve into other Oshkosh products?

Jay Iyengar: Oh yeah. That is, uh, the PLF leader for our program has been, is been, uh. We have been doing, uh, that’s actually level five autonomy because it, it, it is in a closed course. It is in a, a bit of a controlled environment. It, it is a, it, these are autonomous vehicles that follow a, you know, one, one person driving a vehicle. And it’s, it’s obviously for, you know, uh, long convoys things of providing safety and getting people outta the harm’s way. You’ve got the largest fleet out there running, working with the Department of War. And, uh, oh, the learnings have been. Having, having our defense, uh, business really helps us to, to kind of stay at the cutting edge of technology. There are many, many, many technologies that get deployed there. It’s always read across into other applications too. Particularly in that particular case, you talked about, you know, uh, things, an example, details of sensing perceptions and where does radar and lidar fusion of sensing and sense. The fusion of various sensors. What does the algorithm look like? Validation. That’s a really big thing. Big, big part of it. And then, uh, what do you do when you, when you are, uh, when your communications network doesn’t work, right? There’s a lot of failure modes that can happen that you think through. The learning has been tremendous coming from that application. And that’s true for even electrification. There was a, there was a, defense has done a lot of work in electrification long time ago, you know, so we, we. To me, that’s what makes Oshkosh, um, unique and that’s our strength, our product diversity, our end market diversity enables us to transfer tech from one to another. Transfer learnings from one to another, um, both in what to do and what not to do, both aspects of it. That’s, and then, and then really validation is, is it’s all about validation. Making sure that we understand how much to validate and exactly how to validate. 
And that’s where the IP is many times is, is, is validation of your, of your, uh, you know, of your technology out there in the field.
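The leader-follower behavior and the comms-loss failure mode Jay mentions can be illustrated with a minimal sketch. This is not the program’s actual control law; the function, gain, and gap values are hypothetical, and real convoy controllers are far more involved. It shows only the two ideas from the conversation: a follower keeping a gap to a human-driven leader, and failing safe when the leader’s broadcast is lost.

```python
# Toy leader-follower gap keeping with a comms-loss failure mode.
# Everything here is an illustrative assumption, not a real convoy controller.
from typing import Optional


def follower_command(leader_pos_m: Optional[float],
                     own_pos_m: float,
                     desired_gap_m: float = 20.0,
                     gain: float = 0.5) -> float:
    """Return a speed command in m/s for one follower vehicle.

    leader_pos_m is the leader's broadcast position; None models a
    communications dropout, which triggers the fail-safe: halt.
    """
    if leader_pos_m is None:
        return 0.0  # failure mode: no leader broadcast, stop the vehicle
    gap_error = (leader_pos_m - own_pos_m) - desired_gap_m
    return max(0.0, gain * gap_error)  # proportional gap keeping, never reverse


# Follower 10 m too far back speeds up; a dropout halts it.
speed_up = follower_command(100.0, 70.0)   # gap error = 10 m
halted = follower_command(None, 70.0)      # comms lost
```

The `max(0.0, ...)` clamp is the kind of deliberate failure-mode decision Jay alludes to: when the follower is closer than the desired gap, it coasts rather than reversing into whatever is behind it.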

Grayson Brulte: That’s what gives Oshkosh a massive competitive advantage in the market. I’m not going to pull out my magic crystal ball and make a prediction, but I will say that at some point physical AI is going to have its ChatGPT moment. It’s just a matter of time, and I think it’s going to happen sooner rather than later. When that moment happens, where do you want Oshkosh to be?

Jay Iyengar: I agree, it’s big, and it’s beginning to happen; the conversation is changing. We see ourselves as the company that’s transforming the lives of people and transforming end markets. That was the whole idea of the “X of the future” that we talked about. And it’s not just about us; it’s about our partners. We don’t provide every vehicle for every application, so we need to work in conjunction with our entire ecosystem. It’s really about transforming the entire ecosystem, and we want to be the thought leaders at the forefront of that transformation, the ones actually driving at the tip of it. And clearly being humble about it, knowing that we don’t know everything, and working very closely with the leaders, our customers who live this day in and day out, making sure our solutions are practical, pragmatic, and implemented. Most importantly, I would say people don’t care about technology; they care about what it does for them. If we get to that phase, I would call that a success: they get in a vehicle and it’s so intuitive they don’t even need to worry about whether it’s autonomy or radar or lidar. It doesn’t really matter; it just does the job for them and helps them do their job more easily, more efficiently. To me, that’s the vision we’re trying to get to.

Grayson Brulte: And when the technology does its job, everybody wins. You save lives, you increase productivity, and you unlock economic activity, which is really great for the global economy. Jay, this has been fantastic. I’ll ask you one last thing. What is the future of Oshkosh? You’ve been building since the early 1900s, and you can continue to build for another hundred-plus years. What is the future of Oshkosh?

Jay Iyengar: Again, we will be the company that transforms end markets. We will be known as the company that’s everywhere; people may not recognize the specific brands, but we are everywhere. We will be impacting communities and the lives of the people who build communities, protect communities, and serve communities, so that we keep the world moving forward. That is the future of Oshkosh; that’s what we see it to be.

Grayson Brulte: I’ll summarize it this way: Oshkosh is building the future. The future is bright, and the autonomous future is Oshkosh. Jay, thank you so much for coming on The Road to Autonomy today.

Jay Iyengar: Thank you. I appreciate it. Thank you.

The future is bright. The future is autonomous. The future is The Road to Autonomy.
