Matt has a fun conversation with Adam Sachs, the CEO and co-founder of Vicarious Surgical, who is developing a tiny, surgical robot that could radically change how surgery is performed.
YouTube version of the podcast: https://www.youtube.com/stilltbdpodcast
Get in touch: https://undecidedmf.com/podcast-feedback
Support the show: https://pod.fan/still-to-be-determined
Follow us on Twitter: @stilltbdfm @byseanferrell @mattferrell or @undecidedmf
Undecided with Matt Ferrell: https://www.youtube.com/undecidedmf
Very convenient. Yes it does. Also, alongside Matt Ferrell is Sean Ferrell. That's me. I'm a writer. I write some sci-fi, and I write some stuff for kids, including the recently released The Sinister Secrets of Singe, which is in bookstores now. And Matt's conversations on Undecided with Matt Ferrell usually take a look at tech from a sustainability perspective.
But Matt, I understand you've done something different this time. Yeah. And for what we're gonna be talking about today, effectively, Matt and I are not gonna be talking today. It's gonna be Matt talking to Adam Sachs, who is the CEO and co-founder of Vicarious Surgical, and Surgical being the operative word there.
Mm-hmm.
The operative word. Uh-huh.
See, you're not the only one who can do it.
What listeners don't understand is Sean's coming up with these on the fly, right?
That's right. The pun is strong with this one. Yes, it is. So Adam Sachs, who's the CEO and co-founder of Vicarious Surgical. Matt had a long conversation with him, which we'll be cutting to in a moment.
But before we do, Matt, I understand you are kind of not rethinking your approach, but broadening your scope.
Do you wanna talk about that a little bit? Yeah, exactly. My channel from the very beginning has had this tagline at the beginning of every video: exploring how technology impacts our lives.
And the past few years it's been a heavy focus on sustainability, renewable energy, those kinds of things: EVs, solar panels, batteries. And that's understandable,
because that's constantly in the news, and it's being hammered over our heads that sustainability is the thing to talk about and to think about for the future.
Absolutely. So it’s understandable that you would’ve instinctively been taking that
approach. Yeah, but I also have a lot of interests, which is part of the reason the channel's called Undecided: I have a lot of different interests and I kind of wanna bounce around. And as I've been looking at other technologies, medicine is one area where there are a lot of really interesting technology advances happening.
And so I've been starting to look into robotics and things like that and how they're being applied to medicine. And one of the companies I've been talking to recently is Vicarious Surgical, about what they're doing, because their robotic system is going to make for less invasive surgeries, which means faster recoveries and fewer of the errors and issues that could come up.
Safer surgeries, quicker surgeries, easier recovery. And talk about having an impact on our lives: this literally has a direct impact on your life, right? If you have to have a surgery like this. And so I thought this would be a good way to dip a toe in the water, to start talking to companies like this and share some of my conversations with those people.
To give a little bit of background on what Vicarious Surgical is: the Vicarious Surgical robotic system is designed with a focus on abdominal access and visualization through a single port. So for a surgery on something that normally would've required multiple incisions, there's one small incision, and everything goes in through there.
And it really does sound Star Trek. It's like it's this close to saying, we don't even need to cut you
open. Yeah. Well, what's really cool about this little robot is that when you see the images of it, it looks a little creepy and also kind of cute. It kind of looks like a little praying mantis.
Out of this little tube emerges this little thing that looks like it has little light-bulb eyes and little stereoscopic vision. These little hands come out, and basically the surgeon operates this little praying mantis thing inside; it can do what it needs to do and has full dexterity to be able to do sutures and all that kind of stuff inside.
And then they just suck it back up into the little tube, pull the tube out, and you're done. So it sounds gross, but it's absolutely fascinating. It's really, really cool.
It does sound fascinating, and I’m absolutely terrified and can’t wait to watch the interview that you had with Mr. Sachs. So let’s cut to that now.
People, please enjoy and stay tuned for the conversation between Matt
and Adam. Well, thank you so much for joining me today. I was just hoping you could give me a little background about not only your company and the crazy, really innovative stuff you're doing, but also yourself. Like, how did you get your
start?
Yeah. So to take my own background, I'm an engineer. I studied it in school. I went to MIT, studied mechanical engineering, and very much focused on biomedical engineering as well as robotics. You know, I've always been incredibly interested in both medicine and robotics, and thought that it was kind of the perfect combination from the very beginning. After that I spent a little bit of time working at Apple as well, on iPhone manufacturing, but really started this company pretty early in my career.
So what was the inspiration? Was there something you saw missing in the medical space with robotics? What was the inspiration that brought the idea together for creating Vicarious?
Yeah, there's no one specific inspiration. I'd say there are a lot of things that have really inspired what we're doing today.
You know, Fantastic Voyage is a movie where a team of doctors and scientists go into a submarine and they're shrunk down to the right scale to operate on the human body, in their case going in through the bloodstream, much smaller than what we're doing operating on abdominal organs.
But this pretty quickly made us realize that humans are the wrong size to operate on other humans. And if we can use robotics and a lot of modern technology that's really come out only in the last 20 years, we can architect a system completely differently from what exists today in order to enable it to shrink a surgeon down and put them inside of their patient.
Another movie that pops to mind is Innerspace. I've never seen that. You've never seen that? It's the same basic premise of shrinking somebody down and putting them inside of a body for whatever uses you need. So it was basically, as you pointed out, that human hands are a little too large to be able to do some of this intricate work that we need to do inside the
body.
Yeah, and the way it works with abdominal surgery today is that the surgeon is operating from the outside in. Either they make a large incision in the abdominal wall, and that incision causes a tremendous amount of injury to the patient. Almost 20% of patients that have a large, call it open, surgical incision in the abdominal wall
end up with a complication that is a failure of that abdominal wall to heal correctly after the surgery. And the alternative to that is laparoscopic surgery today, of which there are robotic versions, but those really are roboticized laparoscopy, where you're operating with these long, slender
sticks from the outside in. And the end result of all of this is that 97% of surgery today is performed manually. About half of it is performed with large open incisions, and absolutely none of it has any amount of assistive intelligence or any amount of automated patient protection.
So let's get to the details of how your device actually works, because it's still using those sticks or rods to get into the body, but then once it's in the body, it's almost like a miniaturized doctor that's inside, correct?
Yeah, exactly. So existing surgical robots are built on a cable-driven robotic system, and they have something in them that we call, in robotics, coupled motion.
And what motion coupling means is that each of the joints is coupled to each of the other joints. So when you move one joint, you end up moving all of the distal, toward-the-tip joints as well. You can correct this in software, but the problem with coupled motion is that you get an exponential buildup of force. You actually double the force joint by joint as you go up the arm. So for the force that you're delivering to the tissue, if you have just one joint, you need 1x that force on the control cables. If you have two joints, you need double the force on the control cables to deliver the same force to the tissue.
If you have three joints, which is what they have today, you need four times that force. So that's why today's surgical robots have only a wrist on the end of a stick inside of the abdominal cavity. That's it. They have no elbows or shoulders. So what we invented, and have spent the last decade or so perfecting, is a design of decoupled actuators, so that none of those joints are connected to the other joints.
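To make that cable-tension arithmetic concrete, here is a minimal sketch in Python of the scaling Adam describes. The simple doubling-per-joint model and the function names are illustrative assumptions, not Vicarious Surgical's actual engineering math.

```python
# Illustrative sketch of the cable-tension scaling described above
# (assumed simple 2x-per-joint model for coupled, cable-driven joints).

def coupled_cable_tension(tip_force: float, num_joints: int) -> float:
    """Coupled joints: tension on the proximal cables roughly doubles per joint."""
    return tip_force * 2 ** (num_joints - 1)

def decoupled_cable_tension(tip_force: float, num_joints: int) -> float:
    """Decoupled actuators: tension stays flat no matter how many joints you add."""
    return tip_force

for joints in (1, 2, 3, 9):
    print(f"{joints} joint(s): coupled needs {coupled_cable_tension(1.0, joints):>5.1f}x, "
          f"decoupled needs {decoupled_cable_tension(1.0, joints):.1f}x")
```

With three coupled joints you are already at four times the cable tension, which is why a wrist on the end of a stick is the practical ceiling; decoupled joints keep the tension flat no matter how many you add.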
This lets us have as many joints as we want, all with small polymer control cables, because we've eliminated all of that exponential force buildup. And the result is that through one incision, we can put in one incredible imaging system and two instrument arms, each with nine degrees of freedom from the inside, plus four about the incision site.
So 13 total, and that really provides wrists, elbows, and shoulders for the surgeon, plus the ability for the surgeon to move around inside of the abdominal cavity.
So what's the experience like for the surgeon?
So the surgeon sits at a console. They look into a screen that is sort of halfway between VR and a normal screen.
So it's actually a VR headset that reflects off of a flat mirror and then a parabolic mirror, so that they get the immersive experience of VR without having to wear anything on their head, without even having to wear polarizing glasses like you would in a 3D movie theater. And with this, they then control the robot through an interface that frankly is pretty similar to existing surgical robotics.
One of the really interesting things about this project is that when we started it, we started with a true VR interface where you have free-floating hand controllers. You can move your hands anywhere, you can look around. I will say it is the most incredible experience in the world to actually do that.
You feel like you are this little robot, but at the same time, it would be like changing the pedal layout in a car today. You know, maybe the brake should be on the right, but it's probably a good idea to leave it on the left, even if you're designing a Tesla.
So from a surgeon’s point of view, I didn’t realize it was also stereoscopic.
So it feels like you're there. It gives them a sense of depth, and they can feel like they're actually operating as if they were just operating by themselves with
their own hands. Yeah, exactly. So with our decoupled actuators, everything goes in through this one 18 millimeter incision.
Mm-hmm. Which is much smaller than any other single-port surgical device that exists today, robotic or otherwise. And once it's inside, because of those decoupled actuators, we have rotational motion inside of the abdomen. So we have a camera that can yaw, pitch, and roll from the inside, which means that it can go in and up and out of the way, and then allow robotic arms to be inserted underneath.
And what this enables is that our camera has much, much more space than any other surgical camera that exists. So we've actually brought a ton of cell phone technology into our camera system. We have two incredible lens stacks and focusing mechanisms with amazing sensors that, again, we didn't develop any of this.
We buy it off the shelf for incredibly low cost. What we developed is the hardware architecture that allows us to pull it in. On top of that, we can fit things like structured light projectors and lidar chips, again similar to what exists on iPhones today, or even multiple levels of fluorescent imaging that allow you to separately identify in real time all of the critical anatomy within the abdominal cavity.
So, for example, you can fluorescently dye nerves, blood vessels, ureters. There are even dyes in clinical trials for cancer today. And we can use all of those in the background for the entire procedure. Wow.
So where are you in the stages of getting approvals for this to actually be used clinically?
So we've locked the design of the first version and we are building it up right now. The team is building up a number of units. Then we take them through what's called verification and validation testing. This essentially ensures that your system meets your own system requirements, and then that it meets the needs of the users.
And then we go into a small clinical trial next year. So we'll be doing our first clinical patients and then, around the end of next year, filing for FDA authorization. Have you been
working with any surgeons up until this point to kind of get feedback on how the system works?
Yeah. We’re incredibly privileged to have an amazing group of surgeons.
So we have a surgeon luminary group of 20 of the best surgeons in the world. On top of that, we actually have three healthcare systems that are working with us, and in one case even investing capital into our company to fund us. These are HCA Healthcare, UPMC, and University Hospitals.
Together they represent over 200 hospitals and over 150 ambulatory surgical centers in the United States, and they're partnering with us to give us not just surgeon feedback, but also OR team feedback and administrator feedback, getting hospital CFOs involved to make sure that everything we're doing meets the needs of patients, surgeons, and hospitals.
What
kind of training is gonna be required for a surgeon? For somebody that sits down at this for the first time, how quickly are they gonna pick up the controls?
So it's a really interesting question. The formal answer to that is part of the FDA clearance package, but I can tell you what our experience has been so far. We have a cadaver lab in our facility here in Waltham, Massachusetts, and we bring surgeons through regularly, not just ones that we've worked with in the past, but new surgeons on a regular basis. We have a full simulator program, so we bring them through the simulator program to train them, and then we bring them into the cadaver lab, and we see incredible success after just a morning or even an hour or so of simulated experience.
And that really is significantly because we've tried to thread that needle of providing all this additional capability for the surgeon, but through an interface that's really familiar to them, at least for any surgeon who's used surgical robots today. What have been some of the
biggest challenges? Because, as you mentioned, you've been working on this for years, perfecting the system.
What are some of the biggest challenges with building out the robot that you've come up against up until this
point? Yeah, I'd say each stage brings its own set of challenges. In the beginning it was all the engineering challenges around getting our decoupled actuators to work.
That was a multi-year challenge that we had to solve, and that's where a lot of the tech investors, like Bill Gates and Khosla Ventures and Innovation Endeavors, really backed us for that technology. Then came moving into the clinical stage and really turning it into a product that surgeons can use.
And today the challenge will be getting through the FDA process, which is more a matter of going through the steps and just ensuring that we do that with incredible rigor, to be as certain as we possibly can that our clearance process will be successful.
So let's say you get the clearance, you get the FDA approval.
What do you see? Are there specific treatments or surgeries that you think this is gonna be targeted to first? What do you think is gonna be the first application for this?
The first application is gonna be ventral wall hernia repair. So this is a defect of the ventral wall, that's the front of the patient's abdominal wall.
This happens when layers of the abdominal wall essentially fail, and you can actually end up with incarcerated organs, typically a loop of bowel. That can even be life-threatening for a patient. At the very least, it's painful, it's disfiguring, and very uncomfortable.
To do this repair today, it's typically done with an open surgical technique. The vast majority of these, from data from our hospital partners, are performed with large incisions and done layer by layer from the outside in. We are designing our system to be able to do that abdominal wall repair from the inside, all through a single small incision.
And that really is just the first application, though we do cadaveric work for a whole variety of different procedures, and we'll be filing in tight sequence to enable our system to do hysterectomy, oophorectomy, sacrocolpopexy, and other gynecological procedures on the pelvic floor, to do bowel surgery, to do almost all other hernia repair, including inguinal hernia, that's groin hernia, as well as hiatal hernias up in the diaphragm.
And cholecystectomy, the removal of the gallbladder.
To get back to the robotics of how it works, one question just popped into my head. Are there haptics for the doctor to kind of get a sense of what's actually happening inside?
Also a really interesting question. So again, because of our decoupled actuators, we have full sensing throughout the robotic arms for every force that's exerted on the tissue.
And that's unique to our technology, that we can do that. So our surgeon console actually is capable of providing that feedback to the surgeon via true haptic feedback, so that they can feel the force. It's an incredibly cool experience, and what surgeons tell us is that they typically think they want that, until they try using it for a whole afternoon. They quickly find that, if you had the option
between doing something that requires physical exertion all day and doing something that doesn't, most people would choose the non-physical exertion. On top of that, motion fidelity actually goes down when you turn on haptic feedback, because people start to get tired.
And when they get tired, the fidelity of their fine motor skills drops. So the net of all of this is that what you want isn't haptic feedback. What you want is the sensors that we have, coupled with an amount of intelligence that can tell the surgeon when they're exerting too much force, when they're approaching the limit for a specific tissue, or when they've tied a knot with too little force.
Each of these is an incredibly valuable addition and replaces the need for genuine, true haptic feedback.
So it's not one-to-one haptic feedback. It's like you can dial it up just as much as you need for those edge cases, to make sure you're not doing harm.
Exactly, and you actually don't even have to provide it directly to the surgeon via physical force.
Really, what the surgeon wants when they're suturing is to make sure they don't exert too little or too much force. So you start with warnings in that situation and then move to some amount of simple automation to ensure consistency, and make it a little bit like the autonomous lane keeping that most cars have today.
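As a rough illustration of the warnings-instead-of-haptics idea Adam describes, here is a hypothetical Python sketch. The tissue names, force limits, and thresholds are invented for the example and are not Vicarious Surgical's actual values or software.

```python
# Hypothetical force-limit warning logic: use measured force from the arm's
# sensors plus per-tissue limits to warn the surgeon, instead of reflecting
# raw haptic force back to them. All numbers are made up for illustration.

TISSUE_FORCE_LIMITS_N = {"bowel": 2.0, "abdominal_wall": 8.0, "suture": 1.5}

def force_status(tissue: str, measured_n: float, warn_fraction: float = 0.8) -> str:
    limit = TISSUE_FORCE_LIMITS_N[tissue]
    if measured_n >= limit:
        return f"STOP: {measured_n:.1f} N exceeds the {limit:.1f} N limit for {tissue}"
    if measured_n >= warn_fraction * limit:
        return f"WARN: approaching the {limit:.1f} N limit for {tissue}"
    return "OK"

print(force_status("suture", 0.9))  # OK
print(force_status("bowel", 1.7))   # WARN: approaching the 2.0 N limit for bowel
```

In practice the idea is that checks like these surface in the surgeon's console as warnings rather than as physical resistance, which is the trade-off Adam describes.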
That actually ties into one of my questions for you, which is this: in a nutshell, this feels like an extension of a human surgeon. There is probably a lot of software and automation making this all work. How far can you push that automation?
Is there a point at which you don't need the surgeon and it's a robot doing the surgery by itself? Or are we way, way, way away from that?
So, all right, let's start with the easy-to-answer questions there. Yeah. Which is around the value of robotic surgery. Robotic surgery today is all roboticized manual surgery.
It's the same technique, the same basic tools; add a wrist to help with suturing and then put it on a robot. And the value of that is relatively limited, the cost is fairly high, and the result is that 97% of surgery is done manually. The real value, and the potential to transform robotics into the actual standard of care for surgery, comes from providing additional assistive intelligence to the surgeon and to the procedure, and to actually protect the patient from inadvertent injury.
Because surgery today relies entirely on the surgeon, who is a human being, to make no errors throughout the entire case. That is a bit of a ridiculous proposition. It's incredible how seriously surgeons take their jobs, and yet they still do make human errors. So the way we view robotics is that the opportunity is to actually put software in between the surgeon's decisions and what happens to the patient.
But to do that, it requires a couple of things. First, it requires a much more cost-effective, easier-to-use architecture, so that you can quite literally get inside the operating room and get inside the patient. If you're not in the OR, you can't actually do any of this. Then, once you're in the patient, you need the array of sensing.
You need that fluorescent imaging so that you can identify what tissue you're interacting with. You need the lidar chip so that you can map the abdomen in 3D. You need the force sensing so that you actually know how much force you're exerting. And once you start to have all of that, you're not just building the self-driving car. You've now painted every road line fluorescently one color, every car a different color, every pedestrian a third color, right? You've actually started to build out the world to interact with, and that is much easier for a computer and for software to understand. And once you've done that, which is exactly what we're doing,
you can then start to actually add in some of these automated patient protections. So: actually preventing the surgeon from inadvertently injuring something, bringing up warnings when they're exerting too much force on a specific tissue. And that is the key to unlocking the automation that you're talking about.
So first of all, the biggest value comes from exactly that. It comes simply from the fact that a few percent of pelvic floor procedures have ureter injuries, injuries to the urinary anatomy, which are devastating injuries that end up with urine leaking into the abdominal cavity. If you use a fluorescent dye to highlight the ureters and map them in 3D alongside the tips of the robotic arms, you can ensure that the surgeon never hits that anatomy.
And not only does that remove that couple-percent devastating injury, as well as, frankly, the financial cost that comes with it, but longer term there's also the opportunity to have the surgeon not driving every piece of the procedure: to start bringing in simple autopilot tasks, to even let a physician assistant, the first assistant for the procedure, do more of the procedure, because you're putting strict guardrails around what they're doing.
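And here is a hypothetical sketch of that guardrail concept: with critical anatomy (say, fluorescently dyed ureters) mapped in 3D alongside the instrument tips, simple geometry is enough to flag a close approach. The keep-out margin, coordinates, and data structures are invented for illustration and are not taken from the actual system.

```python
# Hypothetical anatomy keep-out check: flag when the instrument tip comes
# within a margin of mapped critical anatomy (e.g. a dyed ureter).
import math

def closest_approach_mm(tip, anatomy_points):
    """Distance in mm from the tool tip to the nearest mapped anatomy point."""
    return min(math.dist(tip, p) for p in anatomy_points)

def guardrail_violated(tip, anatomy_points, keep_out_mm=5.0):
    return closest_approach_mm(tip, anatomy_points) < keep_out_mm

# Toy 3D map of a ureter as a line of points along the z axis (mm).
ureter_map = [(0.0, 0.0, float(z)) for z in range(0, 50, 5)]
tip = (3.0, 3.0, 10.0)
print(f"closest approach: {closest_approach_mm(tip, ureter_map):.1f} mm, "
      f"inside keep-out zone: {guardrail_violated(tip, ureter_map)}")
```

The point of the sketch is only the shape of the logic: map the critical anatomy, track the instrument tips, and interpose a check before motion reaches it.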
I was gonna say, this isn't just about making the surgery less intrusive. It's also making the surgery far safer, because you can put those guardrails in place and help a doctor be a better doctor.
That is exactly what we are designing our system to be able to do. We're designing our system with all of these guardrails to ensure that a surgeon is warned before they make a mistake.
And the goal of all of this, the design intent of all of this, is to allow for significantly better patient outcomes, or probably more accurately, to remove all of the bad patient outcomes.
So how do you see this, not just your technology and what you're doing, but robotics in medicine in general? How do you see it impacting medicine in the next
five to 10 years?
Well, I'll say in medicine, five to 10 years is not a very long time. Yeah, very short. So I think we are gonna have a huge impact in soft tissue medicine. There are also a number of other players coming out with, I'd say, just lower-cost solutions overall that I think will play a role and start to allow robotics
to be a bit more ubiquitous, although still with only the value of roboticized laparoscopy rather than all of this additional patient protection capability. On a timescale more in line with some of the future stuff we're talking about, thirty years out,
I really believe that surgeons' jobs will be much more focused around planning a procedure, working with the patient, setting the goals, and then they'll be able to take a step back and allow other staff to do a lot of the steps of the procedure, maybe be present for the critical steps of that procedure, but allow an assistant to do the rest.
And interestingly, that's also where the real financial value and the ethical value of expanding a surgeon's capability comes from. Because it's unlike a self-driving car, right? If you have a self-driving car and you have to pay somebody $15 an hour to sit in the passenger seat, monitoring the self-driving car and making sure it doesn't do something stupid,
Mm-hmm. You've lost all the financial value of your self-driving car. If you have to do that in surgery, you've only lost a couple percent of the financial value of autonomous surgery and are still retaining most of the value.
That’s actually gonna lead into one of my next questions, which was around the finances.
How is this going to impact the finances around surgical procedures in the coming years?
So this is one of the most important pieces of what we're doing, especially in the US. You know, healthcare needs to be cost practical and cost effective. It's one of the biggest segments of GDP. It is incredibly important that we provide the most cost-effective, excellent-quality care that we can.
So because of that, we've architected our system to be cost practical: removing these gigantic robotic arms from the operating room and, frankly, quite literally miniaturizing everything and putting it inside of the patient. Also streamlining the workflow so that surgeons can do procedures more efficiently and more practically.
Aiming to reduce turnover time in the operating room: fewer things to drape, fewer incisions to make, fewer ports to put in. Aiming for everything to just flow, and even removing some of the training hurdles and burdens around port placement and collisions that exist in surgery today.
So really having an eye on not just one piece of this, but rather every piece of the entire ecosystem around it, and ensuring that everything is done as efficiently as possible. And then the last piece is that we are designing our system to have everything it needs to be able to prevent patient injury, especially in the worst possible cases.
Those are incredibly costly outcomes, even before you take into account any liability associated with them. You have to care for those patients, and that care can be incredibly expensive.
Do you see this reducing fatalities or complications from surgery, with systems like this?
So, I wouldn't say that's a loaded question at all.
You know, I'd say obviously, until we have clinical data to back it up, I can't provide any promises. But the reason we're here, the reason we're doing this, is that the answer is absolutely yes. I believe our system, and we are designing our system, to be able to reduce patient injury and to be able to reduce fatalities. And whatever the number is, it's hard to get good data on this, but whatever you think the error rates and complication rates are in surgery today, they're amazingly low when you look at the fact that it's human beings doing everything, and human beings get tired.
Human beings have bad days. You know, if I have one bad day at work in a year, it's not a big deal. I still had the remaining 364 days of the year. If a surgeon has a bad day at work, yeah, somebody might die. And I really believe we have a real shot at not just chipping away at that, but actually making a real dent in it.
Two final
questions, one of which is: what do you know now that you wish you knew at the beginning
of starting your company? I would say one of the biggest things is how much, in leading a company, especially when you grow from maybe 20 or 30 people into 50-plus people, your job is to hire the right team,
give them the resources they need, let them run, and then make sure that they are doing what they need to do. And if they're not, you work with the people you have or you change the people. Those are your levers. That is kind of the job of running a company.
What advice
would you give to an aspiring entrepreneur or innovator that's trying to get into medicine or robotics? What advice
would you give them? Work with a lot of surgeons and hospitals early. That would be the number one piece of advice, right? Everything comes down to the needs of the customers.
And then, sort of as a corollary to that, it's still your job to make the decisions, right? Your customers don't actually have that holistic insight. So your job is to collect all of that insight, combine it with your own or your team's engineering understanding and clinical understanding, and build that into a product.
Is there anything else we haven't touched on around Vicarious and the robotics that you'd want to touch on?
There are so many things. I'd say come by, see the robot. It's so much better and so much more fun to actually drive it yourself compared to seeing a video of it or just hearing me talk about it.
Right.
Could I actually do that? Would that be a possibility, to come out and see it? Yeah, for sure. I would love to
do that. Yeah, absolutely, we can find the time. It's so much better. It's like a little teeny robot that's about this big, that you can literally drive and play with things.
The images of it look
like a little praying mantis. It's got this kind of look.
I mean, yeah, when the elbows are down, it really looks and feels like a praying mantis. It's a little creepy, to be honest, but I like it better when it's a little more humanoid and the elbows are at a natural height.
It's creepy and also cute at the same time, if you ask me.
And the, like, Stewart Griffin head, I think, helps with the cuteness. The flat-top head. Yeah, that's perfect. It's somewhere in the uncanny valley, though. Yeah.
Our thanks again to Adam for taking the time to have a conversation with Matt. And viewers,
what do you think about all this? Jump into the comments and let us know. How would you feel if your doctor walked into the office and said, good news, we can cure you, we have the praying mantis to do it? Let us know. If you'd like to directly support us, you can click the join button on YouTube, or you can go to stilltbd.fm.
You can click the Become a Supporter button there and throw some coins at our heads, like Benjamin Berger recently did, very generously. Thank you, Benjamin, for joining us, and thank you for the bruises. Both of those options let you support us directly. We heal from the bruises, and then the podcast gets made.
And thank you so much everybody, for listening and watching. We’ll talk to you next time.