Alex Janis

Week 5: Spider-Web Design

Updated: Aug 9, 2019

This week, the parts for the TorsoBot came in and we reassembled it. We shifted our focus back to the original goal of the project: making the leg sit under the torso and give short, quick pushes. First, we changed the start and end angles for the legs to create the short pushes we wanted. We also finally added a keyboard interrupt so that we could stop the program. However, we still needed to make the push-offs faster.
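The control code itself isn't shown here, but as a loose illustration of the idea, a minimal Python sketch might look like the one below. The angle values, speed setting, and the move_leg/stop_motors helpers are all placeholders I made up, not the lab's actual code.

```python
# Minimal sketch (hypothetical names/values): short push-offs between a start
# and end angle, with a KeyboardInterrupt handler so Ctrl+C stops the program.
import time

PUSH_START_ANGLE = 20.0   # degrees; placeholder values
PUSH_END_ANGLE = 45.0
PUSH_SPEED = 0.5          # arbitrary speed setting

def move_leg(angle):
    """Placeholder for whatever command actually drives the leg."""
    print(f"leg -> {angle:.1f} deg")

def stop_motors():
    """Placeholder for a safe-shutdown command."""
    print("motors stopped")

try:
    while True:
        move_leg(PUSH_START_ANGLE)     # set the leg up under the torso
        time.sleep(0.2)
        move_leg(PUSH_END_ANGLE)       # short, quick push
        time.sleep(0.2 / PUSH_SPEED)
except KeyboardInterrupt:
    # Ctrl+C lands here instead of killing the program mid-push
    stop_motors()
```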

Dr. Adamczyk wanted this new progress on the TorsoBot to be put on the lab website, and he assigned me the task of designing a new subpage for it. Although I had some experience creating websites (like this blog, for instance), I had never worked with WordPress, the software used to make the lab website. I started making a video on my iPad of the various TorsoBot tests we had done, to serve as a progress video. This went very well, but I wanted to wait to publish the video until the robot was fully completed. Although it had made leaps and bounds, it still relied too much on leaning forward. I refocused my efforts on writing short, easy-to-understand paragraphs about each of the robots being worked on by BADGER Lab. This included the TorsoBot, the WobbleBot, and the robotic arm.


I practiced driving the pacer robot and realized I am pretty good at it. We put one of the interns' phones on the front while we drove it to see what it looked like. It was pretty cool, though the camera footage was very shaky.

I continued working on the website and the video of the TorsoBot, ultimately making this lovely montage video of our progress. You can see it in my other blog post entitled “TorsoBot Video” or by going to the lab website.


We decided to move forward with the TorsoBot and run some tests using the motion capture cameras. After removing the tail and adding the motion capture markers, I simply tested out different speeds in the settings and collected the data. The grad students taught me how to use the motion capture cameras and the recording software, called Motive. It was very cool: you can see little dots where the markers are. They also had me calibrate the cameras. To do this, you take a stick with three markers a known distance apart and wave it around in front of the cameras. Once this is done, you place an L-shaped object with more markers on the floor so the cameras know where the ground plane is.

I tested the TorsoBot at speed settings of 0.4, 0.5, 0.6, and 0.7. Once I recorded the data, I trimmed the motion capture recordings down to only the steadiest parts of the TorsoBot's motion. It was cool to see the markers from the TorsoBot moving around on the screen in 3D. Then I exported the data to MATLAB, a program for processing large amounts of data. One of the grad students, Thomas Laidlaw, had already written a program for processing the data. From the motion capture recording, the program found the angle of the torso and the speed of the robot, then graphed both as functions of time. We trimmed the ends of the data set, which had irregular spikes, and then the data set was done.
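The real analysis lived in Thomas's MATLAB script, which I don't have here. Purely as a sketch of the idea in Python, the processing could look something like the code below; the two-marker layout, variable names, and stand-in trajectories are my own assumptions, not the lab's data or code.

```python
# Rough sketch of the analysis idea: torso angle and forward speed vs. time.
import numpy as np
import matplotlib.pyplot as plt

# Stand-in marker trajectories; in reality these came from the exported
# motion capture data (time plus forward/vertical positions of two markers).
t = np.linspace(0, 10, 1000)
top = np.column_stack([0.5 * t, 0.30 + 0.02 * np.sin(2 * np.pi * t)])
bottom = np.column_stack([0.5 * t, 0.10 + 0.02 * np.sin(2 * np.pi * t + 0.3)])

# Torso angle: orientation of the line between the two torso markers,
# measured from vertical (0 degrees = upright).
d = top - bottom
torso_angle = np.degrees(np.arctan2(d[:, 0], d[:, 1]))

# Forward speed: numerical derivative of the lower marker's forward position.
speed = np.gradient(bottom[:, 0], t)

# Trim the ends of the record, which tended to have irregular spikes.
keep = slice(50, -50)

fig, (ax_angle, ax_speed) = plt.subplots(2, 1, sharex=True)
ax_angle.plot(t[keep], torso_angle[keep])
ax_angle.set_ylabel("torso angle (deg)")
ax_speed.plot(t[keep], speed[keep])
ax_speed.set_ylabel("speed (m/s)")
ax_speed.set_xlabel("time (s)")
plt.show()
```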


After collecting all our data, we compiled it into one graph. There were a few outliers that we had to remove, but otherwise the data looked great.
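Again only as a hedged sketch (not the script we actually used), combining the runs from each speed setting and filtering outliers might look something like this in Python, with stand-in numbers in place of the real measurements:

```python
# Sketch only: combine runs at each tested speed setting into one graph,
# dropping points more than 3 standard deviations from each run's mean.
import numpy as np
import matplotlib.pyplot as plt

# Stand-in results: one array of measured speeds per speed setting.
rng = np.random.default_rng(0)
runs = {s: s + 0.05 * rng.standard_normal(200) for s in (0.4, 0.5, 0.6, 0.7)}

for setting, values in runs.items():
    mask = np.abs(values - values.mean()) < 3 * values.std()  # outlier filter
    plt.plot(np.flatnonzero(mask), values[mask], ".", label=f"setting {setting}")

plt.xlabel("sample")
plt.ylabel("measured speed")
plt.legend()
plt.show()
```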


