After falling in love with the idea of having my own factory on my desk, I finally got a 3-D printer.
The significance of having a 3-D printer is not simply that I have another toy. It isn't even that I have a new tool that lets me make things I dream up, gain experience with design tolerances, and go from ideation to prototype. I think the real significance of my 3-D printer is that I am joining a movement that will have a huge impact on our economy and culture, probably as big as the computer revolution. The ability of individuals to manufacture their own objects and products at home will drastically change consumerism, and will probably force big changes in intellectual property rights once consumers can make physical objects themselves.
I chose to get a MakerBot Thing-o-Matic because the company has heritage in building low-cost 3-D printers and because they are a for-profit company, so I knew the product would be supported. I could have gotten a RepRap for significantly less, but decided against it simply because of the challenges I have faced in the past when working with open-source projects.
Another cool feature of the Thing-o-Matic is that it comes with an Automated Build Platform, which allows me to print many objects in a row without my intervention. I could leave it home printing all day while I am at school.
It will really let me get creative in making things, like last-minute parts for my capstone project, camera mounts for rockets, actual rockets, small design projects, and of course the rapidly growing collection of things that already resides on Thingiverse.com.
A slide-show of the build process is below (in reverse order for some reason).
As an early adopter might expect, the instructions for assembling the MakerBot were not always as perfect as one might like. They were entirely online, which is good because a paper copy would probably weigh a significant amount, and I was content to read them off my screen. As the Thing-o-Matic has been revised and changed piece by piece, you could tell which instructions had been written for earlier models by their clarity. The newer instructions were quite helpful, with lots of pictures, but the old ones occasionally left us scratching our heads - though usually not for long. Luckily there were user-posted comments on every page of the instruction manual, and those pages were well organized.
The instructions claimed that assembly would take about 16-20 hours. Although I did not time myself, I believe it took me longer than that. Ellen Farber assisted for the entire first half of the project, and then I continued on my own. During the final stages I found it easy to make mistakes during assembly, and there were a few times I had to take apart major sections. I accidentally reversed the body panels a few times before finally arriving at a configuration that works. It was magical when I first turned it on and a thin string of semi-molten plastic emerged from the extruder.
The most challenging, most frustrating, and most time-consuming part of the entire process was calibrating the machine after it was completely built. The bug that kept getting me was a 'slipping' of the Y-axis during some builds. Eventually I discovered this was because I did not have the stepper motor types I thought I had, and they had been adjusted incorrectly.
The next challenge was spool management. This is a serious issue until you slap enough printed parts onto the side of your printer to keep the filament under control as it feeds in.
My biggest challenge at the moment is dealing with warping on the build surface. As it turns out, the Automated Build Platform trades significantly reduced print quality for its automation. The belts also wear out over time, making things worse. But the absolute worst thing about the ABP has to be that you cannot level it. Because of the DC gear-motor it is front-heavy, so the little bit of slop in the rods tips the whole platform forward. I am weighing my options: 1) buy the heated build platform in addition to the ABP, 2) get a titanium belt for my prints (about the same price), or 3) figure out something else.
I actually took this photo myself.
In the summer of 2009, as a rising junior, I got the amazing opportunity to intern at the Jet Propulsion Laboratory. To make it even better, I was lucky enough to be part of the most exciting, scientifically promising, and coolest space project happening anywhere in the world - the Mars Science Laboratory.
I was assigned as an intern to the Surface Sampling Subsystem (SSS) and was mentored by Daniel Limonadi and Jason Feldman. Daniel was the Validation and Verification Phase Lead for SSS, and Jason was the JPL project manager for the Sample Analysis at Mars (SAM) instrument. The SSS includes the rover's arm, all of the attachments and appendages on the arm (including everything on the turret), the spare drill bits on the front of the rover, and the analytical instruments inside the rover that will process the regolith (soil) samples. The purpose of my project at JPL was to perform software testing for the two analytical instruments onboard the Mars Science Laboratory: SAM and CheMin.
SAM. In all her glory.
SAM is the largest, heaviest, and most expensive instrument on board the rover, and will also be the most capable scientific instrument ever sent to the surface of another planet. It contains a quadrupole mass spectrometer, a tunable laser spectrometer, and a six-column gas chromatograph. It really is a very impressive chemistry laboratory in a box - a box which is now inside the rover Curiosity, and will soon be on Mars.
The other analytical instrument on MSL is the Chemistry and Mineralogy instrument, better known as CheMin.
I did functional-level testing of the software in two different ways. I did most of my work on a software simulation of the hardware and operating system called WSTS (Work Station Test Setup), which allows developers to quickly test software right from their desks - a huge benefit for expediting development. I also tested software on hardware simulators, which were kept in the test bed and were ESD-sensitive. This meant I could not only verify that the current software release for the two instruments was functional, but also evaluate the fidelity of WSTS by comparing results between the two.
I didn't always have an easy time with this project; I had a lot to learn about software testing, and testing in general, as I designed and developed the tests. I also needed to learn a number of tools in order to actually perform them. I was able to use scripts to turn command lists into batches that could run on their own, so I did not have to enter each command individually.
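The batching itself was nothing fancy. Here is a minimal sketch of the idea in Python, with a made-up command-file format - the actual JPL command syntax and tooling are obviously not reproduced here:

```python
def batch_commands(lines):
    """Collapse a command list (one command per line, '#' for comments)
    into an ordered batch ready to replay in sequence, instead of typing
    each command into the console by hand."""
    return [ln.strip() for ln in lines
            if ln.strip() and not ln.strip().startswith("#")]

# Hypothetical command list -- real instrument commands looked nothing like this.
listing = [
    "# power up the instrument",
    "PWR_ON",
    "",
    "  HTR_ENABLE ",
    "REQ_HK_TELEM",
]
print(batch_commands(listing))  # -> ['PWR_ON', 'HTR_ENABLE', 'REQ_HK_TELEM']
```

The same batch can then be fed to whatever console tool accepts piped input, which is all the "scripting" really amounted to.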
Photo by Scott Maxwell.
2:53 am MST
We have just entered the final hold for the Taurus XL, which should, in less than 20 minutes, blast off from Vandenberg Air Force Base. If all goes right, it will carry NASA's Glory aerosol-measuring spacecraft as well as three university-built CubeSats, including our own, Explorer 1 [Prime].
Over a week ago we had our first launch attempt, with a lot of tension and drama. To enter the A-train orbit there is only a very tight 47-second launch window available each day, and if you are launching the rocket south, it happens at about 3 am. More people are here than there were last time; on the previous attempt, about half a dozen of the most prominent contributors were in California as well.
I am not sure what has happened, but the launch window has apparently shrunk to only 24 seconds, and there are 10 minutes left on the clock. The same thing happened during the final hold last time, and it was nerve-wracking to watch the launch window shrink in front of our eyes. We were relaxed before, but going into the final hold things are starting to tense up. There are still many items left on the checklist, and hopefully all of them will pass without incident. The big moment will be the item called "S-Band on," which is where the clock stopped last time and the mission was scrubbed. We just passed!!!
So it appears that Orbital Sciences successfully fixed the issues that plagued us last time.
6 minutes left on the clock, 3:30 am
Passed the final launch poll... T-5
People are getting really quiet now. Still 24 seconds left in the launch window.
Auto Sequence Start-up at T-1:15
30 sec. everyone standing
Successful launch, first stage normal. 16 minutes until CubeSat deployment.
And, lo and behold... Orbital just can't get it together. The fairing once again did not separate. This happened last time with OCO, but at least it is good knowing that the Taurus XL will never haunt us again, as this was supposed to be its final flight. Claiming OCO, Glory, and three CubeSats.
The only thing keeping us together is knowing that we already had a flight lined up for Flight Unit 2. At least that will be on something that isn't terrible.
So, it is clear that the ejection charge was not able to save the rocket from destruction; it did not go off until the rocket dropped below the trees. When the pieces were recovered, it turned out the rocket had landed about 435 feet away from the launch pad. That makes me feel pretty good, knowing it came down that far away from everybody.
After studying the wreckage and where all of the pieces landed, I think I can completely reconstruct what happened in the few moments before impact.
We lost sight of the rocket in the trees and never saw the ejection charge fire. However, from the soot and black-powder residue left on the rocket, I can conclusively say that the ejection charge went off before the rocket broke up. I believe that instead of ejecting the forward section from the interstage and deploying the flagging to slow its descent, the charge blew the rocket apart right before the interstage, keeping the forward part of the rocket and the interstage together. The flagging, sadly, was ripped to shreds by the shock, and small pieces were scattered around the area; in the cold it became much more brittle than I had expected. Once the rocket split in two, the aft piece was still composed of three layers of cans. The middle layer struck a dead tree, puncturing a can and separating it from the other two layers. I reasoned this from the lack of damage on the other two layers of the aft section and from where all three layers landed. There was also an ugly-looking dead tree in just the right place for it all to make sense.
Captions under full-sized images.
The only piece of the puzzle that I was not able to satisfactorily figure out is what happened to the nose cone of the rocket. I tried looking around for it pretty hard, but was never able to find it. I may try again during the summer, although I think I would be even less likely to find it then.
I wanted to do something while I was home on break from school. While my initial plans were totally different and more complex, I think I can be proud of the recycled rocket.
This was meant to be a continuation of the Coors 1-X beer-can rocket body, and to act as a stepping stone to the Coors V by letting me experiment with transitions between can configurations that differ between levels. For example, this rocket transitioned from a level of 4 MGD cans, to two levels of 3 Coors Light cans, then to two levels of single Coors Light cans, and finally to a bottle of Alaskan Oatmeal Stout for the nosecone. I was able to obtain some F20-4W motors (20 newtons average thrust, 4-second ejection charge delay after burnout, white flame). I also collected as many beer cans as I reasonably could. This rocket became known as the "Recycled Rocket" because I used much more than just beer cans, relying heavily on cardboard for the construction of not just the fins but also the stage adapters and interstage. Recycled paper was used for the aerodynamic nose cone; this rocket was made of 100% post-consumer waste, excluding the tape, epoxy, hot glue, and rocket motor.
The first photo gallery shows some of the construction of the rocket. (click images for captions)
Admittedly, criticisms of this project could include the use of glass in the nose of the rocket and the lack of a slow-recovery system. In defense against such criticism, the reader should be aware - and can see in the videos of the launch - that the range used was entirely private and massive (several miles of possible range), and the rocket was launched at a lower QE (launch angle WRT normal) to carry it away from structures and observers.
Video of the Launch:
You can see another video from a different angle here
Unfortunately, the aspect ratio I captured that at didn't allow me to edit it using iMovie 6. That's what I get for holding my iPhone vertically, I guess.
Unfortunately, it is impossible to see the apogee or descent of the rocket in these videos, but the rocket was heavy enough that it dropped behind the trees before we saw an ejection charge.
I made some guesses at the coefficient of drag of the rocket and, using Excel, modeled a 1-dimensional flight path. That model seems to have since been corrupted, but I think I estimated an upper-limit apogee of around 300 m or more. With negligible drag, a free fall from that height would take roughly eight seconds (t = sqrt(2h/g)). Including the coast time of the rocket, the ejection charge was never going to fire very high anyway due to the large weight of the motor, and adjusting the delay timing was unfortunately not possible.
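Since the Excel model is gone, here is a rough sketch of a similar 1-D model in Python. Every parameter here - mass, drag coefficient, frontal area, burn time - is my guess, not a recovered value from the original spreadsheet, so treat the output as illustrative only:

```python
import math

G = 9.81                        # m/s^2
MASS = 0.45                     # kg, rocket + motor (assumed)
CD = 0.6                        # drag coefficient (assumed)
AREA = math.pi * 0.033 ** 2     # m^2, frontal area of a ~66 mm can (assumed)
RHO = 1.2                       # kg/m^3, air density near sea level

def thrust(t):
    """Crude F20 motor model: ~20 N average for an assumed 2.5 s burn."""
    return 20.0 if t < 2.5 else 0.0

def simulate(dt=0.01):
    """1-D Euler integration of altitude until the rocket returns to the ground.
    No recovery system is modeled, matching what actually happened."""
    t, v, h, apogee = 0.0, 0.0, 0.0, 0.0
    while h >= 0.0:
        drag = 0.5 * RHO * CD * AREA * v * abs(v)   # opposes velocity
        a = (thrust(t) - drag) / MASS - G
        v += a * dt
        h += v * dt
        apogee = max(apogee, h)
        t += dt
    return apogee, t

apogee, flight_time = simulate()
print(f"apogee ~ {apogee:.0f} m, total flight ~ {flight_time:.1f} s")
```

With these made-up numbers the model lands in the same few-hundred-meter ballpark as my original estimate, which is about all a 1-D point-mass model can be trusted for.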
I will have more on the recovery and crash analysis next post.
In my first post, I described how awesome SpaceVision was and how I intended to start a SEDS chapter at Montana State. I am glad to say that since then, Montana State SEDS has had a very successful first meeting, with over twenty students in attendance from a range of disciplines.
We had a healthy mix of mechanical and electrical engineers in attendance, plus a few business marketing and business management majors. We may have had some other, less represented majors too; I cannot remember exactly which fields were there that night, because our first meeting was about a month ago, at the beginning of December.
We have decided upon a few goals to pursue as a group, and we have 5 officers (including me as president :D). Our goals for the coming semester are:
- Do a rocket project
- Host launch parties
- Have interesting speakers talk about their experience in the space industry
- Do outreach to high schools and high school students in the Bozeman area
- Attend SpaceVision 2011
SEDS at Montana State is set up for a very successful first semester, and we will be starting our meetings again the second week of classes.
This last summer (2010) I was lucky enough to be part of the 2010 NASA Robotics Academy at Marshall Space Flight Center in Huntsville, Alabama. There I was partnered with other college students from around the nation who also had a passion for robotics. Some of them were already experts in their field, and had extensive robotics experience.
I was placed on a team with three other students, and was assigned to work on a distributed swarming robotic system under Dr. Robert Ray. Our goals were to develop a swarming system based on the iRobot Create platform, and have the robots autonomously interact and cooperate with each other.
The iRobot Create is made for robotics research and experimentation; it is essentially a Roomba without the vacuum cleaner, but with a cargo bay to add your own stuff and a 21-pin connector to interface with the Create. This makes it a great way to get started on a robotics project like ours. They are inexpensive and easy to work with.
We were trying to create a swarm of six robots that would be location-aware and would make coordinated movements together. This was to be a centralized pseudo-swarm that did not use "swarm / hive intelligence." Because in a flight environment we would never want a robot to have the ability to make its own decisions and possibly act on its own, we had a centralized desktop computer that ran the AI and directly controlled the individual robots. With this important decision made, we had a number of goals to accomplish, which included:
Wireless communication between the robots and the computer
AI for all 6 robots to move independently and simultaneously
Location determination for all of the robots
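As a sketch of what "centralized" meant in practice: all of the intelligence stays on the desktop, and each robot just executes the serial opcodes it is sent. The Drive opcode and its byte layout come from the published iRobot Open Interface; the send-function plumbing is a hypothetical stand-in for our actual wireless setup:

```python
import struct

DRIVE_OPCODE = 137  # iRobot Open Interface "Drive" command

def drive_packet(velocity_mm_s: int, radius_mm: int) -> bytes:
    """Serialize a Drive command: opcode byte plus 16-bit big-endian signed
    velocity (-500..500 mm/s) and turn radius (-2000..2000 mm)."""
    return struct.pack(">Bhh", DRIVE_OPCODE, velocity_mm_s, radius_mm)

class CentralController:
    """Centralized pseudo-swarm: every decision lives here, and each robot
    only executes the bytes it receives. In our setup each send function
    would wrap the wireless link to one robot's serial port."""
    def __init__(self, send_fns):
        self.send_fns = send_fns  # one callable per robot

    def step(self, commands):
        """commands: one (velocity, radius) pair per robot, computed by
        whatever AI runs on the desktop."""
        for send, (v, r) in zip(self.send_fns, commands):
            send(drive_packet(v, r))

# Usage sketch: capture the bytes instead of sending them over a socket.
sent = [[], []]
ctl = CentralController([sent[0].append, sent[1].append])
ctl.step([(200, 500), (100, -500)])
```

The nice property of this design is exactly the one we wanted for a flight-like environment: a robot that loses its link simply stops receiving commands, rather than improvising on its own.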
Our swarm of six robots with Kat Blackburn (programmer) in the background
One of the iRobot Creates with extra electronics
My responsibility as the mechanical engineer on the project was to design and build the integration of the electronics that allowed wireless communication between the central computer and each robot.
Mechanically, I had to mount the electronics onto the robot. This included the Freescale "Tower," which contained the real-time processor and a board with a network adapter, plus the wireless router / adapter and the voltage regulator. Early models needed a laser level as well.
Electronically, I had to design and make cable adapters to transfer power and data between the robot and the added electronics, as well as the cables connecting the various components. Space inside the robot was tight, and I tried to keep the profile as low as possible, which made routing the cables more challenging and required some creativity to make everything fit.
Being able to build each robot let me work hands-on with the hardware rather than at the computer. Actually putting things together also provides the feeling of making real progress, and is a great thing to do whenever something else is frustrating.
The biggest challenge was location determination for the robots. Many techniques were tried, but in the end none of them worked.
The first approach was to use lasers, fanned into a line on each robot, and light detectors in different spots of the room to trilaterate the positions of the robots, but we could not find lasers whose wavelengths matched the detectors, or vice versa. Additionally, the laser on each robot was not very powerful, so beyond a few inches the detectors would no longer pick up the signal, even in the dark. Around the same time I also encountered a lot of problems with signal processing on the computer side, and with the difficulty of using LabVIEW to detect pulses in real time. LabVIEW is great for a lot of things, but it was not great at identifying and processing signals in real time.
The next approach was to use the IR diode "walls" that come with the iRobot Create. The thought was that we could detect the walls ourselves with the IR sensor on each Create and determine the Creates' locations as the direction of the IR beam spun around the room. In theory it would work great, but there were more difficulties.
The first difficulty this ran into was getting the Create to send data back to the computer; interference kept chopping the packets into pieces. The second problem was the range of the walls, which could not be made good enough to determine their location at a useful distance, although it was better than before. Precision had also always been a problem with this approach. I tried to collimate the IR LED as best I could, but it was never going to be good enough for what we actually wanted: plus or minus a few centimeters at a range of about 10 meters.
We were never able to implement location detection in the robots, and by the end of our five-week project they were not location-aware, although the other goals of coordinated movement and wireless communication were met.
As I waited in the airport for my flight out of Huntsville, I was talking to a member of the all-graduate-student team, also part of the NASA Robotics Academy, about my frustrations with the hardware available to us as interns and the difficulties of my project. I told him that I thought the best way forward was to use sonar to measure distances and trilaterate the positions of the robots.
He told me about an amazing system called Cricket, developed at MIT to ultrasonically locate robots with centimeter accuracy in an indoor environment. As it turns out, they had developed exactly what I was dreaming about.
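For the curious, the math I had in mind is ordinary trilateration: given ranges to beacons at known positions (for ultrasound, ping time-of-flight multiplied by the speed of sound), the position falls out of a small least-squares problem. A 2-D sketch - the beacon layout and numbers are made up, and this is my illustration, not Cricket's actual algorithm:

```python
import numpy as np

def trilaterate(beacons, dists):
    """2-D position from distances to >= 3 beacons at known positions.
    Subtracting the first range equation from the others linearizes the
    system, which is then solved by least squares."""
    b = np.asarray(beacons, dtype=float)
    d = np.asarray(dists, dtype=float)
    # For each beacon i > 0:  2*(b_i - b_0) . p = d_0^2 - d_i^2 + |b_i|^2 - |b_0|^2
    A = 2.0 * (b[1:] - b[0])
    rhs = (d[0] ** 2 - d[1:] ** 2) + np.sum(b[1:] ** 2 - b[0] ** 2, axis=1)
    pos, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return pos

# Robot at (3, 4), beacons in three corners of the room (made-up layout).
beacons = [(0, 0), (10, 0), (0, 10)]
dists = [5.0, 65 ** 0.5, 45 ** 0.5]
print(trilaterate(beacons, dists))  # -> approximately [3. 4.]
```

With more than three beacons the same least-squares step averages out range noise, which is why systems like Cricket can get down to centimeter accuracy indoors.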
I think the biggest lesson I learned this summer is that even when time seems very limited and you must charge ahead down the path that looks easy, it is worth asking whether it is the best way, and approaching it with a somewhat cynical mind so you can identify problems earlier. It is also good to seek help whenever you encounter a problem, especially from those more experienced than you, even if you think you have it figured out - because you just might not.
Despite the setbacks in location determination, using dead reckoning we were still able to get the robots to move together, both independently and simultaneously. The project succeeded in all other areas of development and was a success. It may be continued by Robotics Academy students next year.
After finishing Afternoon Delite, Ellen Farber and I soon decided to move on to another summer project.
I don't know exactly how this idea came about, but it was decided that we should make a rocket out of old Coors Light cans. Yay recycling! It was originally proposed that we create the Coors V, in commemoration of the Saturn V. This project also required a lot of running around to hardware stores and Home Depot, trying to find the best place to buy the epoxy we wanted and picking up the necessary supplies. We used an E9 motor because that was the largest we could find in nearby hobby stores, and managed to adapt the motor mounting kit to the inside of a can with the top and bottom removed. Once the motor was selected, we realized we had to scale back the rocket, so instead of a 5- or 7-can configuration we went with a single-can stack. The project became a test bed for what will eventually be the Coors V rocket.
Engine: E9
Body: Coors Light cans x3
Parachute: Estes rocket kit plastic chute
Nosecone: Coors Light bottle
Altitude: 426 ft
The nose cone and entire forward section separated during the ejection charge because we had not been able to securely fasten the shock cord to the can. Video coming soon.
The Coors V will require about 56 cans. I'm working on it.
While at the NASA Robotics Academy this summer, one of the activities we did was attend the Alabama Space Grant Consortium's High-Powered Rocketry Workshop. There we learned about the University Student Launch Initiative (USLI) competition and all the facets of starting a team and competing. That all sounded like a lot of paperwork, and since everyone there was primarily a robot nerd, we were just excited about getting to build big rockets and earn National Association of Rocketry (NAR) High Power Level 1 certifications.
We got these really great starter kits by MadCow Rocketry called the Patriot, and had a few days to build and paint them. No one on our team seemed particularly excited about painting the rocket, which we believed added little intrinsic value other than reducing drag.
We felt that we could do better than just build some dumb rocket that went up and came back down. We wanted to do something cool. We wanted to do something different. It was no coincidence that we came to this unanimous conclusion just minutes after the conclusion of a talk by the legendary Tim Pickens.
Tim Pickens has been active in amateur rocketry for about as long as anyone can remember and has made himself well known for his daring endeavors, brilliance, and quality rocket design. He became the lead propulsion engineer for SpaceShipOne, developing the hybrid motor that carried Scaled Composites and Virgin Galactic to the Ansari X Prize. He then went on to start Orion Propulsion and made a ton of money doing rocket engine design and support for the military, NASA, and commercial sectors. Recently (since we last saw him) he has been leading the Rocket City Space Pioneers in their quest to win the Google Lunar X Prize. He is also well known for his fun and zany home projects, many of which involve rocket propulsion.
We came to the conclusion that we could put a video camera inside the rocket and film the ascent and descent. After much shopping and deliberating, we decided to use an Olympus point-and-shoot that I owned (YIKES!). We had our work cut out for us in making this relatively simple project happen. It took many trips to Home Depot and other stores in Huntsville to find parts that would work well, fit well, and be light enough for our rocket. The most challenging part was planning the assembly, as we had to construct the payload bay after we had already assembled the rocket. This required adding another bulkhead, a camera mount, a way to keep the nose cone securely attached, a payload interconnect, another shock cord mount, and a hole for the lens. We also needed to create a much more complicated checklist and pre-flight procedures.
Luckily our project was a success and we had a safe launch with an H motor.
The camera is not really made for shooting video and, frankly, kind of sucks at it. I need to get a better video camera for the next time I launch this. It also had a hard time focusing, due to the edge of the hole being visible to the camera and the rapidly spinning terrain. When the parachute charge went off, the camera went black but continued to record sound. We believe this may have been the mechanical shutter closing from the sudden jerk, but all theories are really just speculation.
Afternoon Delite is in my room right now and is awaiting the spring, when it will fly again.
The body was cut at an angle because we realized that we could not make a straight cut, and an angled one would be easier to align later.
It was a crazy weekend. So much happened at SpaceVision, and I met so many really cool and influential people in NewSpace, that it is kind of mind-boggling. I met giants of the industry there, people whose names almost everyone in the field recognizes. I did not meet enough students from other parts of the country, but there is simply a fundamental limit to the number of people I can remember from any given weekend.
Anyway, all of the speakers were pretty amazing and I had such a good time seeing all of my NASA academy pals there. I have come away from there with so much enthusiasm and energy that I just want to graduate right away and start making spaceships. Or go to the moon.
One of the things I realized I had to do was get my name out there and start documenting all of the projects I have done and will be doing - online.
Another thing that got me really excited while I was there was learning about SEDS. Before I arrived, I was not even aware that it stood for Students for the Exploration and Development of Space. It may be a testament to how cool SpaceVision was that before the weekend was over, I had dedicated myself to creating the Montana State University SEDS chapter. We have a large student base interested in space and a number of faculty who have had a lot of experience in the space industry, so I think we can have some very interesting presentations. I am also not the only one who decided to start or restart a SEDS chapter after SpaceVision: www.efarber.com will be leading the charge to restart SEDS at Harvard.
I know that SpaceVision 2011 will be amazing and I plan on being there in Boulder. I got to meet a lot of people from CUSEDS and it is going to rock my socks.