Chris Yip 0:01 Welcome to Coffee with Chris Yip, the official podcast of the Faculty of Applied Science and Engineering at the University of Toronto. I'm Chris Yip, the Dean here at U of T Engineering. In each episode, I'll be sitting down for coffee with someone from our amazing global community to talk about what they're working on, and how it places us at the heart of bold solutions to design a better world. In this first season, I want to zoom in on "the why": finding out what drives the curiosity and passion of our extraordinary community. Once you understand that, I hope you'll start to see what makes this place so special, and that you'll be inspired to make, innovate and create along with us. From self-driving cars to helicopters on Mars, autonomous robots are starting to turn up everywhere. I should know. I recently purchased a drone of my own, and I'm pretty grateful for the fact that it knew better than to let me land it on the lake at the cottage. My guests today are two professors who are pushing the boundaries of what autonomous robots can do. They hail from both the University of Toronto's Institute for Aerospace Studies and our Robotics Institute, which is a multidisciplinary hub for robotics research at U of T and beyond. Professor Angela Schoellig works with flying robots, much like the drone I've been playing with, except she's teaching them how to navigate without the need for human pilots. And Professor Tim Barfoot focuses more on land-based autonomous robots, including some that can propel themselves down steep cliffs. Both are advisors to the student team that created Zeus, a self-driving car from U of T that has placed first in the international AutoDrive Challenge the last four years in a row, an amazing achievement. So welcome to both of you. How are you guys doing today?

Tim Barfoot 1:43 Good, doing really well.

Angela Schoellig 1:44 Good, thank you.

Chris Yip 1:45 Great. So really looking forward to our chat today. Hope you've got your coffee ready. None of us has had it delivered by an autonomous robot in our house yet, but maybe that's coming. So first off, I'm going to maybe just start with the usual. Tell me a little bit about your academic background, and what actually drove you into autonomous robotics? And maybe I'll start with Angela.

Angela Schoellig 2:06 Yeah, great. So I really come more from a mathematics background. I did mathematics for dynamic systems and how to control these dynamic systems. And dynamic systems are systems that change over time, or objects that move. And while I love the math, doing only the math was not really that fulfilling, so I was looking for tangible examples where we can apply this mathematics and have a real impact in the world. And if I think about robotics, I really think about it as kind of the next form of computing, one that can really interact with the world and will have an impact in many different application areas.

Chris Yip 2:50 So sort of the ability for them to become self-controlling, in a sense, being able to operate without external input. I was thinking about that as I was flying the UAV. I'm so used to an RC car, where you can't let go of the controller or it's going to crash, and then you realize you can just let go of the thing and it just sits there; the wind can blow it but it doesn't go anywhere. It knew it shouldn't land on the lake, which I was quite happy about. So clearly, these algorithms are terrific. So Tim, tell us a little bit about how you ended up in this space.
Tim Barfoot 3:20 Yeah, so actually, I am a person who bleeds blue, and what that means is that I did all of my schooling here at the University of Toronto, from undergrad to grad, and now I'm on the faculty here. And I came to robotics through our Engineering Science program, where I took the aerospace option. I was very much enamored with one particular course that I took in undergrad, Gabe D'Eleuterio's dynamics course, where I was introduced to the notion of modeling robots. I ended up signing up to do a PhD with Gabe's group, and at the time, robotics was very focused on manipulation. Robots had been used for a long time in automotive assembly lines and things like that. But robots that can move in the world, what we call mobile robots, were only just starting to become something people talked about. And I just fell in love with the idea of robots that could move around. And really, I'm still kind of just in love with that idea, as I explore different applications of these so-called "mobile robots." It's amazing how challenging it is to actually get a robot to go out in the world, where there's so much dynamic stuff going on, and just try to get it to robustly move around from one place to another. And I would say that, although we started working on that 30-plus years ago in robotics, it's still pretty hard. Like, we still don't have self-driving cars parked in our driveways, so there are lots of exciting challenges around that, but that still sort of gets me out of bed in the morning today.

Chris Yip 4:42 It seems the pace has accelerated so quickly over the last few years, right? I was just following a little bit of the Olympics and they were showcasing how the villages are set up. They've already got self-driving buses going around the Olympic village, so you're starting to see these things. It's really interesting to see how quickly the technology appears to have developed, but the reality is, as you said, right? Decades of research, decades of initiative, but now this massive growth. Could you give us a sense of how quickly the Robotics Institute itself has grown and what its mandate actually is?

Tim Barfoot 5:18 The current Robotics Institute actually started under another name, the Institute for Robotics and Mechatronics, which was exclusively housed within the Faculty of Applied Science and Engineering here at U of T. And a couple of years ago, we kind of broadened it out and renamed it the Robotics Institute, and a big part of that was that we wanted to involve, in a much deeper way, the Department of Computer Science, which actually sits in a different faculty here at U of T. And this is really important, because much of the research and excitement around robotics today is about the software that controls robots. And so we felt that it was super important to partner more tightly with computer science, which we've now done. So about two years ago, we launched the U of T Robotics Institute, which actually brings together faculty members from many different departments: from aerospace engineering, to mechanical and industrial, electrical and computer engineering, and computer science. And we're looking to broaden that out even further to other areas, including the Faculty of Medicine, and perhaps others. There are probably about 20 faculty members working full time in robotics proper here at U of T across all of these different departments.
And now we kind of sit together under this umbrella of the Robotics Institute, and it's been fantastic, because it's basically allowed us to establish a real community here at U of T around robotics. I think we're all finding a lot of value in having these very rapid dialogues going on; we have a Slack channel where we're messaging all the time, we're hosting events, and yeah, it's just been fantastic.

Chris Yip 6:45 The big claim to fame right now is that the largest concentration of roboticists in Canada is now at U of T. The way you describe it, it really does seem like there's been just a massive step change, and I think, as you indicate, it's really the advances in the software space that are helping to enable these sorts of opportunities. Angela, could you give me a sense of a recent project in your lab that's really been exciting?

Angela Schoellig 7:10 One area I'm really excited about is how to develop machine learning algorithms for robotics. If we apply machine learning to real-world robotic systems, we must guarantee safety; we must have an understanding of how they work and when they fail. And so in my lab, we have developed some cool algorithms in this area that are data efficient, given that data in robotics is costly to get compared to some other machine learning areas, and that also come with safety guarantees. And we have demonstrated those algorithms on flying robots, self-driving vehicles, but also robot arms that are mounted on a mobile base. So if you came to our lab, you could, for example, throw a ball and a mobile arm would catch it. And then we have really exciting collaborative projects, which bring different expertise to the table. A lot of my collaborative projects have been with Tim, and we have worked on self-driving vehicles, for example using radar to help with localization in bad weather conditions. But we have also worked on vision-based flight for drones, and maybe Tim can jump in and talk a bit more about this.

Tim Barfoot 8:28 Yeah, so let's zoom in to this one project that Angela was mentioning that we've done together over the last, I guess it's about three or four years now. So we've been working with a local Canadian company, Drone Delivery Canada. When you typically buy a drone off the shelf from Best Buy, or something like that, it relies quite heavily on GPS, the Global Positioning System, to understand where it is in the world. What we're trying to do is see if we can augment what's going on with GPS using cameras on board the robot. And so why might you want to do that? This company, Drone Delivery Canada, is trying to fly packages into the far north to some isolated communities, and this requires them to basically fly far enough away from the operator that the operator can no longer visually see the drone in the air. And so there are regulations that come out of Transport Canada around whether you're allowed to do that, where you're allowed to do that, and when you're allowed to do that. And part of making the safety case for being able to fly beyond what's called line of sight from the operator is needing to have some kind of fallback technology to make the robot safe. So what Angela and I have been working on is the idea that maybe you're flying outbound using GPS to navigate, so the robot knows where it is, all the while building up a visual map of your outbound route as you fly. The idea being that if you were to lose communications with the drone, or GPS were to stop working, it could switch over and then use its live camera data to match against this map that it built on the outbound pass to know where it is, and then try to basically retro-traverse, or come back along that same path, and get back to where it took off so that the operator could see the vehicle again. So you can think of it as an emergency return function. And there are lots of challenges around getting this to work out in the real world: there's wind, there's a finite amount of energy and mass that you can deal with up on a drone, so trying to get these very sophisticated vision algorithms running in closed loop while getting the robot to fly stably and safely has been really exciting for us. But we actually have that running now, so we have drones up at the Institute for Aerospace Studies that we're able to fly out using GPS; they build this map, and then they fly back just using cameras and referencing against this map. And then hopefully they can eventually make the case to Transport Canada so that they can fly beyond line of sight and get the regulations changed.
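For readers who want to picture how such an emergency return might be structured in software, here is a minimal Python sketch of the teach-and-repeat pattern Tim describes: record a visual map on the GPS-guided outbound leg, then localize against it with the camera and retrace the route if GPS or the radio link drops. It assumes a hypothetical drone interface (drone.gps_ok(), drone.camera_features(), and so on); none of these names come from the actual system, and the real algorithms are far more sophisticated.

```python
# A purely illustrative sketch, not the actual Drone Delivery Canada / U of T system.
# All class, method and variable names are hypothetical stand-ins.

def feature_distance(a, b):
    # Placeholder similarity metric; a real system would match feature descriptors.
    return sum(abs(x - y) for x, y in zip(a, b))

class OutboundMap:
    """Keyframes (image features + pose) recorded along the outbound route."""
    def __init__(self):
        self.keyframes = []  # list of (features, pose) tuples

    def add_keyframe(self, features, pose):
        self.keyframes.append((features, pose))

    def localize(self, features):
        # Return the pose of the best-matching stored keyframe.
        _, best_pose = min(self.keyframes,
                           key=lambda kf: feature_distance(kf[0], features))
        return best_pose

def fly_mission(drone, waypoints):
    outbound_map = OutboundMap()
    # Outbound ("teach") pass: navigate on GPS while building the visual map.
    for wp in waypoints:
        while not drone.at(wp):
            if drone.gps_ok() and drone.link_ok():
                outbound_map.add_keyframe(drone.camera_features(), drone.gps_pose())
                drone.fly_toward(wp, current_pose=drone.gps_pose())
            else:
                # GPS or communications lost: fall back to vision-only return.
                return emergency_return(drone, outbound_map)
    drone.deliver_package()

def emergency_return(drone, outbound_map):
    # Return ("repeat") pass: localize with the camera against the outbound map
    # and retrace the recorded poses back toward the takeoff point.
    for _, pose in reversed(outbound_map.keyframes):
        while not drone.at(pose):
            visual_pose = outbound_map.localize(drone.camera_features())
            drone.fly_toward(pose, current_pose=visual_pose)
    drone.land()
```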
Chris Yip 10:45 So that's a great example of thinking about all the challenges that come from a very simple concept, right? Building in the safety factor is building in the ability to return home. I was thinking about the same thing. We've become so reliant on GPS for things; people drive their cars off docks because they're just "following the GPS." I think it's really interesting to think about a drone that can be sort of self-aware and fall back on something like that when another system fails. I gave a little bit of a hint at the start about this cool student challenge, the AutoDrive Challenge. Can you give our audience a sense of what this competition was, for those who may not have heard about it?

Angela Schoellig 11:26 AutoDrive was a four-year student competition sponsored and initiated by General Motors and the Society of Automotive Engineers (SAE). The amazing thing is that in the first year already, they built a self-driving car from scratch in seven months, with a team of undergrad students and some help from grad students. I think that's just an amazing feat, doing that in seven months, and then competing in the US and winning that first competition. And from there, the competitions got more challenging, and the students developed, you know, more advanced features and capabilities for the self-driving car. And at the same time, students get an amazing educational experience, and they're very sought after by the top leading self-driving car companies after they graduate.

Tim Barfoot 12:22 It was a competitive process to become one of the eight teams involved in the first AutoDrive competition, so we had to submit a proposal, and we were accepted. And we became one of two Canadian teams, along with the University of Waterloo. So GM basically donated a Bolt EV, which is a car that you can actually go out and buy at a dealership, and they provided some information about the low-level systems in the car so that we could connect computers in and be able to command the speed of the car and make it turn and things like that. But basically, from there, the students had to build everything up almost from scratch.
So they had to select the sensors that they were going to put on the car so that it could see, physically mount those, and write all of the software completely from scratch, all on a very shoestring budget. And in their spare time, students did the same kinds of things that many startup companies are trying to do in the self-driving car domain now, and I think they knocked it out of the park, in my opinion. Every time I see what they're doing, I'm so impressed: the sophistication of their thinking, not just around the specific algorithms or components, but the way they're approaching engineering and thinking at the systems level. They're basically doing the exact same thing that a company would be doing in this space. Angela and I are super hands-off. The students are basically doing everything, and we're just providing programmatic advice.

Angela Schoellig 13:52 I think what we and U of T did was create the right environment for the students to be successful, and the students took it from there.

Chris Yip 13:59 And that's what you love to see. I remember when I first started as Dean, I went up to visit the labs at the Institute for Aerospace Studies and the Robotics Institute to get a drive in Zeus. And I remember you had programmed in the course the day before, when it wasn't raining, and I showed up for my tour and it was raining, and you said, "Get in the car and let's see how this works," and it was amazing. The grad student was sitting there with crossed hands, not actually touching the steering wheel. To see the LIDAR working, and to see the raindrops basically coming across the screen because it was detecting them, and it knew when to turn right and when not to turn right, I was just blown away. We've talked a lot about the opportunities that are emerging in the robotics space, in the autonomous robotics space. What do you feel might be the next applications that would be really commercially feasible over the next five or 10 years?

Angela Schoellig 14:55 So from my perspective, I think there will still be many places where robots will play a role that are not visible to the general public, such as logistics, mining, manufacturing, and the energy space in general. And I think there is a lot of low-hanging fruit there: if people go in and just understand the challenges that these industries have, and then use the capabilities that we have built in robotics, they can help solve those. But of course, we are also very excited about applications where robots are visible, such as on the road. It's an incremental process: there will probably be autonomous trucks and self-driving buses first, and it won't be your personal vehicle that is fully autonomous in the next five years. I also think in healthcare, just supporting healthcare workers so that they don't have to do routine tasks, but can focus on the important aspects of their job.

Chris Yip 15:53 Yeah, we're starting to see a lot of that with social interaction robots in elder care, for instance, and for patient monitoring. I was very interested, actually, and Tim, we've had this discussion a little bit, about autonomous systems in the mining industry, where you've got very routine paths that trucks or haulers are taking. And we've had discussions with companies that are already implementing this, and Angela, as you mentioned, in commercial applications, where again, you don't see it in the public sense.
Tim Barfoot 16:21 Yeah, I'm going to echo what Angela said. I think the public tends to get very excited about what's reported in the media, and the idea that we all might have a self-driving car parked in our driveways, but I don't really see it unfolding that way. I think we will see these applications that are a little bit more out of sight happening first. I used to actually work on autonomous haul trucks, as you mentioned, but those are hidden underground, and people don't see them. I'm actually going to quote Angela's PhD advisor, Raff D'Andrea, who is a former EngSci student. He once told me that it's very important in research that we work on pushing the envelope and really understanding what's easy and what's hard, so that when we go and build real systems, we can gravitate towards the easy side in order to roll those ideas out first, and I think that's what will happen in the self-driving space. We're seeing companies doing public transportation, buses driving on fixed routes, right? So you only really need to get good at those particular routes rather than have an infinite number of situations to contend with. Or these companies that are working on delivering your groceries or your pizza, because then you don't have to deal with the comfort issue, and if the vehicle gets in trouble while it's just delivering something, it can sort of pull off to the side of the road and wait for some support. If there were a passenger in the vehicle, that might not be tolerated. So I think all of these kinds of things are going to roll out first. And we're going to learn a lot about that and push the envelope and build the technology, and our ability to operate in more challenging situations will grow over time. And I also think, on the road, a lot of the infrastructure, the traffic lights, the signage, and all of that, has very much been designed for human drivers to interact with. And certainly one way to make all of these problems easier is to redesign some of that infrastructure, like traffic lights that communicate with cars, so that we don't have to build deep learning algorithms that detect traffic lights in camera images; we just have to build a radio receiver that gets the traffic light state directly from the light itself. I think a lot of things like that could be rolled out gradually. And they could first be used as safety features in our human-driven cars. And then once all that infrastructure is available, it'll become much easier to roll out these more complicated, sophisticated applications of self-driving.

Chris Yip 18:33 Yeah, I think this really reflects what you both mentioned a little bit earlier, which is that this is really a systems design challenge. I'd be really interested in your thoughts: we've talked a lot about land-based robots and flying drones, but what about the marine environment?

Tim Barfoot 18:50 Yeah, so there's actually some marine robotics activity just starting up here at U of T. So Florian Shkurti, who's an assistant professor in computer science at the Mississauga campus, has a track record in this area and has spun up a project on water quality monitoring, and I'm joining him on that. Canada has got two million lakes, and there are many programs out there. Part of contending with climate change is being able to monitor what's going on with these lakes.
I know you have a cottage, Chris. There's probably a lake association on your lake, and hopefully they've signed up for the Lake Partner Program, where there's citizen science going on to monitor the quality of the water from year to year. I'm involved in the same thing on my own cottage lake, and I think there are lots of opportunities for robotics to help with some of that environmental monitoring, right? So trying to deploy robots to go and take different observations in order to at least be able to measure what's going on with the environment as the climate starts to change, right? As you start to look at different levels of algae and E. coli and things like that in the water.

Chris Yip 19:54 So Tim, I'm going to go backwards on you a little bit. There's a little bit of a connection between us in the context of your public school. You went to the public school my kids are going to and that I went to. Was there anything from your public school days that you remember that might have inspired you to head down the robotics track?

Tim Barfoot 20:11 Well, I don't know if my memory is that long, but I guess, you know, many memories of an uncountable number of hours playing with Lego sets. And I remember when I got my first two motors as part of a Lego set, and the very first thing I did with those two motors was build a robot hand that could actually twist and grab things, with absolutely no knowledge of what a robot really was. But I just was like, "Hey, I can make something like my hand." And then in high school, I was very fortunate to have a couple of friends who were nerds like me, and we got a grant for $400. It was my first grant, $400. And we actually built a mobile manipulator by ourselves out of, like, wood, and I don't even know what, plastic gears and things. And we built this robot and it could drive around, and it was computer-controlled, so we wrote all the software, and we even built a 3D visualization interface so that we could visualize what it was doing in the world. That type of hands-on experience has been critical to my path in robotics, being able to not just sit in front of a computer, but get out and do things with my hands, and I think that's what I like about robotics, and I guess that's the type of opportunity that I'm hoping we can provide for our students through things like AutoDrive.

Chris Yip 21:32 Angela, what was your inspiration if you went all the way back? I pushed him all the way back to grade six.

Angela Schoellig 21:39 Yeah, I did not start with robotics until my PhD, really, I think. But there were a few glimpses of robotics along the way that I didn't quite recognize at the time. First, I joined a programming class that was not mandatory in my high school, and I don't even know why. I just wanted to do it. And I was curious, I guess, what it was and what we could do with it. And we had a computer at home, and I tried everything that was possible at the time. And then robotics crossed my path a few times, but in unintended ways. For example, in my undergrad, I attended what I thought was basically a two-week vacation in the mountains with a lot of hiking, but the theme was robotics, and there were robotics classes every morning. So I appreciate these early efforts of people kind of showing me robots.

Chris Yip 22:37 I'm going to end off by maybe asking you each just a quick question. So maybe I'll start with Tim. What do you see as the next big challenge in robotics?
Tim Barfoot 22:46 Yeah, I don't really have one thing, I guess. I think there are many challenges. Robotics right now has sort of made a promise to the public that there are going to be all of these interesting things happening. The challenge is going to be to deliver on those promises, right? And I think the challenge, maybe, for the world with robotics is understanding the difference between science fiction and science fact. Many of us have sort of learned about robotics through science fiction, and the reality is that it's a lot more complicated than the movies make it look. So I think people are just going to have to be a little bit patient and understand that the technology has a long way to go to catch up with what we might think it can do.

Angela Schoellig 23:27 Robotics is enabled by lots of different technologies, like better CPUs, cheaper sensors, better machine learning algorithms, and then we need to cleverly integrate these technologies, and that is an interdisciplinary process. A lot has happened with these enabling technologies, and that's why we are so excited about robotics and think robots will become much more capable in the next 10 years. But that means that robotics becomes an enabling technology for lots of applications, from mobility to healthcare, transportation, and manufacturing, and I think even more interdisciplinarity and collaboration will be required to understand the challenges in these different areas and how robotics can really be applied in a reasonable and appropriate manner. Collaboration will be really essential to make that leap towards those new application areas. And that's why I also think everybody should join us in making this happen, from different disciplines, different backgrounds, because that's really needed to make that leap.

Chris Yip 24:36 Terrific. That's inspiring and engaging advice, actually trying to get everybody to join in. I think there are huge opportunities going forward. Yes, I think science fiction has given us almost too much of an anticipation of what it can do, but as we've all seen with things like Star Trek, it gives us an inspiration for what we should try to make. We strive toward those goals, and Tim, I think you've framed it nicely: we need to be patient and thoughtful in what we're trying to do and what we're trying to accomplish. And I think, while I do look forward to when I can have an automated robotic person in my house delivering my coffee, I will still enjoy brewing it on my own. So thank you so much for taking time today to chat about what's going on in the field of robotics. It's been terrific. Thank you.

Angela Schoellig 25:22 Thank you.

Tim Barfoot 25:23 Thanks a lot, Chris.

Chris Yip 25:25 Thanks again for listening to Coffee with Chris Yip. If you want to catch up on past episodes, or make sure that you don't miss the next one, please subscribe. We're on SoundCloud, Apple Podcasts, Spotify, and more. Just look for Coffee with Chris Yip. You can also check out @UofTEngineering on Twitter, Facebook, Instagram and LinkedIn for more stories about how our community is building a better world. And finally, if you've been inspired to join us, we'd love to welcome you. Whether you're thinking of taking a degree or working with us on a research project, you can find us online at engineering.utoronto.ca, or you can visit our beautiful campus in Toronto, Ontario, Canada. I hope I can join you for coffee soon.