All Projects

Prosthetic Foot Test Rig

April 2024
Developed a prosthetic foot test rig that simulates the distributed loading of a gait cycle in order to assess how well prosthetic feet mimic the biomechanics of an anatomical ankle. My team and I are currently furthering this project by validating the test rig so that it collects reliable data.

Project Overview

For my senior capstone project, my team and I were matched with a researcher at Tufts Medical Center to develop a test rig that assesses prosthetic feet against a quantifiable standard our client aims to establish.
Currently, we are validating our system after considering the loading limitations of our load cell.
Free body diagram for force and moment calculation

Background

Current testing standards for prosthetic feet fail to consider how well a prosthetic foot mimics the biomechanics of an anatomical human ankle during a natural walking cycle. To account for this, our client, Dr. Mark Pitkin, seeks to establish a measurable standard known as the Index of Anthropomorphicity (IA). For prosthetic feet, the IA quantifies the concavity or convexity of the moment-angle relationship: the relationship between the moment required to deflect the foot and the corresponding angle of deflection.

In order to establish this standard, a means of reliably and accurately measuring moment-angle data in a prosthetic foot during a walking cycle is necessary. Our goal for this project was to develop a test rig that applies axial and tangential forces to simulate the dynamic loading seen on a foot during a gait cycle, and that measures the moment generated at the ankle at the corresponding angle of deflection, in order to characterize the IA for a range of prosthetic feet.
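To make the IA concrete, here is a minimal sketch (not our actual analysis code; the quadratic-fit approach and all names are illustrative) of how moment-angle samples could be reduced to a single concavity measure:

```python
import numpy as np

def concavity_index(angles_deg, moments_nm):
    """Fit M(theta) ~= a*theta^2 + b*theta + c and return the quadratic
    coefficient a: negative means a concave moment-angle curve (as in an
    anatomical ankle), positive means convex."""
    a, _b, _c = np.polyfit(angles_deg, moments_nm, 2)
    return a

# Made-up example data: a moment that flattens out at larger deflections
angles = np.linspace(0, 15, 30)             # deflection angle [deg]
moments = 2.0 * angles - 0.05 * angles**2   # moment about the ankle [N*m]
print(concavity_index(angles, moments))     # negative -> concave
```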

Design

We went through many iterations of this design. There were four main concepts we designed for:

  • Foot clamping mechanism: needed to secure the foot down and allow for testing in the sagittal and frontal planes
  • Axial loading: needed variable loading with a controller to simulate body weight over a gait cycle
  • Bending force: needed to measure the perpendicular force required to cause deflection, in order to calculate the moment about the ankle
  • Angle measurement: prosthetic feet bend at the ankle unpredictably, which required special consideration
Design brainstorm session focused on angle measurement
Proof of concept prototype sketch
Final concept sketch

Fabrication

We fabricated our design from cut 20/20 aluminum extrusion and machined aluminum. We programmed the test rig using an ESP32-S3, used linear actuators to apply the loads, and used load cells both to close the control loop and to calculate the moment about the ankle. Finally, we used an encoder on a sliding gantry with a linkage system to measure the angle of deflection (a sketch of the firmware's control loop follows the photos below).
Proof of concept prototype
Load cell calibration with an Instron
Measured and cut 20/20 aluminum extrusion
Deep into milling the main housing component
Initial wiring mess, documenting wiring
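To give a sense of the firmware's control flow, here is a minimal MicroPython-style sketch for the ESP32; the hx711 load-cell driver, the pin numbers, the gain, and the assumption that the driver returns a calibrated force are all illustrative stand-ins, not our actual code:

```python
from machine import Pin, PWM
from hx711 import HX711            # hypothetical load-cell driver

TARGET_N = 400.0                   # commanded axial load at this point in the gait cycle
KP = 0.8                           # proportional gain (illustrative)

cell = HX711(dout=Pin(4), sck=Pin(5))   # assume read() returns newtons after calibration
actuator = PWM(Pin(6), freq=1000)       # drives the linear actuator

while True:
    error = TARGET_N - cell.read()
    # Proportional correction of the actuator duty cycle around mid-scale
    duty = min(max(int(512 + KP * error), 0), 1023)
    actuator.duty(duty)
```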

Final Design

Analysis/Results

IA characterization of MF foot
Moment-angle behavior in a normal ankle, from [1]

These are some initial results from our test rig. We tested the WillowWood Meta Flow (MF) foot, which was designed to have a more compliant behavior that mimics what is seen in anatomical ankles. As the figures show, a normal anatomical ankle exhibits a concave curvature, similar to the MF foot.

Looking further into our results, we recorded the axial forces that we applied and the perpendicular bending force at each corresponding angle of deflection. From these we calculated the moment generated about the ankle and normalized it by the simulated human body weight that we input into our program.
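As a rough sketch of that post-processing step (the lever-arm geometry and all names here are illustrative, not our actual script):

```python
def ankle_moment(perp_force_n, lever_arm_m):
    """Moment about the ankle from the measured perpendicular bending force
    and the distance from the ankle joint to the load application point."""
    return perp_force_n * lever_arm_m

def normalized_moment(moment_nm, body_mass_kg):
    """Normalize by the simulated body weight so results are comparable
    across loading conditions and feet."""
    return moment_nm / (body_mass_kg * 9.81)

# Example numbers only: 120 N bending force applied 0.15 m from the ankle,
# normalized for an 80 kg simulated user
print(normalized_moment(ankle_moment(120.0, 0.15), 80.0))
```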

Detailed results in IA characterization of MF foot
[1] M. Pitkin, "The Moment Criterion of Anthropomorphicity of Prosthetic Feet as a Potential Predictor of Their Functionality for Transtibial Amputees," Biomimetics, vol. 8, no. 8, p. 572, Dec. 2023, doi: 10.3390/biomimetics8080572.

Pressure Assessment in Wrist Splints for CTS

February-May 2024
Developed a sensitive, non-invasive, and cost-effective means of capturing relative pressure forces on the interior of a wrist brace intended for Carpal Tunnel Syndrome (CTS) treatment, in order to assess the effectiveness of various splint designs.

Project Overview

In this study, we developed a cost-effective and non-invasive means of recording the compression exerted on the carpal tunnel by wrist braces intended for nocturnal splinting for Carpal Tunnel Syndrome, in order to consider how different rigid splint designs impact relative pressure distribution around the wrist.

Background

Carpal tunnel syndrome (CTS) is characterized by compression of the median nerve in the carpal tunnel of the wrist, causing symptoms of pain and numbness in the hand and forearm. After four years of arts high school, followed by engineering school and a number of repetitive hands-on hobbies on the side, I have developed early CTS myself, so when my team and I were considering problems in our day-to-day lives to focus our project on, I brought up the lack of effective treatments for early CTS.

The only real noninvasive treatment for CTS is wrist splinting; studies have shown that the internal pressure on the median nerve is at its lowest when the wrist is maintained in a neutral position. That said, studies have also shown that the effectiveness of wrist splinting as a treatment for CTS is unclear.

Knowing this, we considered the possibility that wrapping a splint around the wrist leads to external compression of the carpal tunnel, negating the effects of holding the wrist neutral. We also considered the impact of different rigid splint insert designs on the pressure on the carpal tunnel relative to other points around the wrist.

Methods

There were two main components to this experiment: the different rigid splint insert designs we were testing, and the wrist splint that we modified to record pressure at points on the wrist. For the latter, we used a commercially available wrist splint marketed for CTS treatment.

The Splints:
  • The wrist splint we used came with a rigid splint insert that we used as a basis for our designs to ensure fit
  • We designed four different geometries for our splints, maintaining similar stiffnesses by using thicker-gauge aluminum for skinnier geometries
  • We waterjet-cut our designs and bent them ourselves, recreating the original splint to maintain consistency in manufacturing
Data Recording:
  • Given our budget, we used small, slim force-sensitive resistors (FSRs) to record pressure data
  • The FSRs were positioned as shown in the following diagram, and I installed them with sewn tacks
  • We used an Arduino Uno to collect the data as analog values packaged into a message, and developed a LabVIEW program to unpack the message, plot the data live, and save it for post-processing (the unpacking step is sketched below)
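The live plotting and logging lived in LabVIEW, but the unpacking step is easy to illustrate in Python. This sketch assumes the Arduino prints one comma-separated line of raw analog values per sample; the port name and sensor count are illustrative:

```python
import serial  # pyserial

PORT = "/dev/ttyACM0"   # illustrative port name
NUM_SENSORS = 5         # one analog value per FSR (count is illustrative)

with serial.Serial(PORT, 9600, timeout=1) as ser, open("fsr_log.csv", "w") as log:
    while True:
        line = ser.readline().decode(errors="ignore").strip()
        fields = line.split(",")
        if len(fields) != NUM_SENSORS:
            continue                          # skip partial or garbled messages
        readings = [int(v) for v in fields]   # raw 0-1023 ADC counts
        log.write(line + "\n")                # save for post-processing
        print(readings)                       # stand-in for the live plot
```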

Results

We tested our four splint designs while the wrist was held still and while it was in motion. The graphs below show the results of one trial testing the control splint design; the graph with an oscillating pattern is the trial with motion. We then found the average value of each FSR and plotted them against each other, paying particular attention to sensor 4, which was positioned directly over the carpal tunnel. We concluded that the optimal design was the one with no bend below the wrist.

Printing Press

April 2024
Modeled a fully functional etching printing press in SolidWorks and animated its assembly.

Project Overview

For my Computer Aided Product Design final project, I was tasked with modeling and assembling anything I wanted in SolidWorks. I decided to be a bit ambitious with this project, but there were a few requirements:

  • At least 5 unique components (not including standard components)
  • Some type of dynamic element
  • Parts include minimal complex surfaces
  • Reasonable level of complexity in the assembly/component

Deliverables

For this project I needed to model each component and create an assembly, create an exploded view, animate the assembly collapse and its dynamic element, create part drawings for 1-2 components, and create an assembly drawing with the exploded view.

I decided to model a printing press since I've loved printmaking since high school and have been thinking about building a press for myself recently; I figured I could use this project to plan it out. After some research on different press types, I found this blog that detailed how one person managed to build a press by hand: https://www.instructables.com/Build-a-Printmaking-Press/
I ended up following this blog, but it proved not to be very detailed, which led to me making a lot of design choices myself. That was not the initial scope of the project, but it was no real issue for me.

Robot Cafe

April 2024
Developed a fully robotic cafe that was able to take orders, make lattes, and deliver them to the customer.

Project Overview

In my Intro to Robotics class, the final project tasked us as a class with developing a fully automatic robot cafe implementing skills from the course:
-Linkages
-Gears
-ROS
-APIs
-Image processing

Approach

Since this was a self-led, class-wide project involving 25 people, it was important that we delegated tasks and components of this project appropriately. We ended up dividing ourselves into four general teams whose sizes varied with intensity/demand: Coffee Team, Milk Team, Art Team, and Transport Team.

Overview

The system works as follows:

  1. An Airtable is used to track the progress of the system (a sketch of how stations read and update it follows this list)
  2. The customer orders a latte at the Ordering Station, picking a size and a latte art stencil (image processing is used to identify the selected design), and the info is logged in the Airtable
  3. At the Cup Station, solenoids dispense a cup onto the platform attached to a Create 3
  4. The Create 3 rotates the empty cup into the Coffee Station, where the lid opens, a pod is dispensed into the machine, the lid closes, and the espresso machine is turned on and dispenses espresso into the cup
  5. The cup is rotated into the Milk Station, where the milk frother, filled with milk via a pump and frothing since the order was placed, is tipped to pour its contents into the cup
  6. The cup is rotated to the Art Station, where the correct stencil has already been oriented over the cup position, and a solenoid taps a dusting of cinnamon over the stencil and onto the frothed milk
  7. The finished latte is rotated into the Customer Pick-Up Station, where an ultrasonic sensor detects the customer taking the cup
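As a sketch of the Airtable bookkeeping (the base, table, field names, and statuses below are illustrative, not our actual schema), each station could poll for orders in its state and advance them:

```python
import time
import requests

API = "https://api.airtable.com/v0/BASE_ID/Orders"   # base/table are illustrative
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}

def next_order(status):
    """Return the first record currently in the given state, if any."""
    params = {"filterByFormula": f"{{Status}}='{status}'", "maxRecords": 1}
    records = requests.get(API, headers=HEADERS, params=params).json()["records"]
    return records[0] if records else None

def set_status(record_id, status):
    """Advance a record through the pipeline, e.g. 'ordered' -> 'brewing'."""
    requests.patch(f"{API}/{record_id}", headers=HEADERS,
                   json={"fields": {"Status": status}})

# e.g. the Coffee Station claims freshly ordered lattes
while True:
    order = next_order("ordered")
    if order:
        set_status(order["id"], "brewing")   # hand off to the espresso sequence
    time.sleep(2)                            # poll interval
```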

Coffee Subsystem Process

I worked on the Coffee Team, working directly with the espresso machine and making augmentations. Here is an overview of the subsystem:
  • Since we had a larger group and a progression of tasks, we split into 3 sub-teams: Lid Team, Pod Team, and Button Team; I worked on Lid Team
  • The Coffee Team had its own Airtable to communicate status across sub-teams
  • Pod Team used a linkage system and a 3D printed channel to reliably push pods into the slot of the machine in the right orientation
  • Lid Team used a linear actuator as a linkage to open and close the lid
  • Button Team used relays to connect to the buttons of the machine and trigger the correct actions using values in our team Airtable. Button Team also created the ordering station for the customer to choose the size of their drink

Lid Team (Individual Contribution)

I worked on the Lid Team because I thought the mechanical challenge of generating the force necessary to puncture the pod would be interesting. We initially considered a linkage system and a gear train to achieve the necessary force, but we ended up with a much more elegant design, using a strong linear actuator in a simple linkage to open and close the lid.

To fix the end of the linear actuator onto the tip of the lid, and since the lid itself was solid metal, we needed to develop a strong connecting link. After much iteration we landed on a custom clevis made up of two pieces of waterjet-cut aluminum sheet bent around the lid and held in place by a notch sawed out of the lid. The linear actuator is attached with a threaded rod fed through the holes and held in place with nuts.

Since this was a pretty load-heavy process, it was important to fix everything in place firmly. We also developed a lofted base to prop the espresso machine up to an operational height for the Transport Team. We used a series of 80/20 extrusions to mount the linear actuator and then bolted the whole system to the plywood base we were working on.

Results and Areas of Improvement

We were able to successfully make a latte for a customer, though not without a few issues. Button Team ended up having some last-minute issues with relays, which resulted in some manual value updating during the demo. We also intended to make three lattes in sequence for the demo, but after the first successful cup, the pods in the espresso machine jammed, leading the hinge of the lid to fail under the force of the linear actuator. This was the unfortunate end of the Robot Cafe, since there was no easy fix.

I do think there were a couple of things we could have done to prevent this. While working on this, my sub-team and I debated implementing limit switches to ensure that the lid was being opened and closed all the way. Since we felt like we were constantly dealing with different issues, this ended up not being prioritized as it should have been. The pods in the machine jammed because the lid did not open far enough to drop the used pod from the first latte, and I do think limit switches would have prevented this.

That being said, I am still very proud of the results of this project. I think the team coordination was very impressive given the scope, and I'm very proud of my team and how it all came together.

Object Based Maze Runner

March 2024
Programmed an iRobot Create 3 to recognize a set of objects in order to navigate a maze autonomously using Google's Teachable Machine and a PiCamera.

Project Overview

Using a Create 3 robot from iRobot, we developed a program that controls the robot to recognize a number of objects in a maze and make a specific turn based on the object recognized. Specifically, the robot was supposed to make a 90 degree turn 6 inches away from each object. We did not know the orientation of the objects, and we did not know which direction each object would indicate until 10 minutes before testing.

Set Up

For the object recognition aspect of this project, we were tasked with using Google's Teachable Machine to develop a model to recognize each object. There were a total of 7 different objects that needed to be catalogued with the Teachable Machine, all shown in the following image. In order for the robot to actually see the objects, we used a PiCamera, for which we needed to develop a mount. In addition, since the robot needed to stop 6 inches from each object, some kind of distance sensor needed to be implemented. Ultimately this left us with either an ultrasonic sensor or the robot's built-in IR sensors.

Approach

The first course of action was to build a mount for the camera. For this, we developed a laser-cut piece to secure the Raspberry Pi and hold the PiCamera in a fixed position.
From there we took many photos of each of the objects at different orientations and in different lighting with the PiCam to upload to the Teachable Machine. After training the model, we exported it as a TensorFlow Keras model that we could work with in OpenCV.
Programming the actual motion in ROS involved spinning with a RotateAngle action and driving with a command velocity topic. We also decided to use the IR sensors built into the robot for simplicity to detect distance.
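For context, the classification loop for a Teachable Machine Keras export typically looks like the sketch below; the file name, label order, and camera index are illustrative, and the 224x224 input with [-1, 1] normalization is the convention Teachable Machine image models use:

```python
import cv2
import numpy as np
from tensorflow.keras.models import load_model

model = load_model("keras_model.h5")    # Teachable Machine export (name illustrative)
labels = ["bear", "mario", "object3", "object4",
          "object5", "object6", "object7"]   # illustrative; must match training order

cap = cv2.VideoCapture(0)               # PiCamera exposed as a V4L2 device
while True:
    ok, frame = cap.read()
    if not ok:
        break
    img = cv2.resize(frame, (224, 224)).astype(np.float32)
    img = img / 127.5 - 1.0             # scale pixels to [-1, 1]
    probs = model.predict(img[np.newaxis, ...], verbose=0)[0]
    print(labels[int(np.argmax(probs))], float(probs.max()))
```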

Results

We were able to get our robot to respond pretty well to each object. We did run into issues with the model often confusing the bear with the small Mario figure; to try to amend this, we took more pictures of each object, but this did not seem to make a difference, and it appeared to be a common issue across other teams.
We also ran into a bit of a roadblock when we realized that the positioning of the IR sensors on the robot meant that, at 6 inches, some objects were too short to be caught by the sensor. We also realized that, being an IR sensor, it simply could not detect all-black objects, since the black color absorbed the IR instead of bouncing it back to the sensor. To work around this, we hard-coded into the program that if the model recognized any of these objects, it would track how long it had been recognizing the object and, at a certain point corresponding to about 6 inches, stop and turn. This ended up working pretty well, and we considered just using this approach for all of the objects, but we stuck with the IR sensors since we were proud of getting them to work.
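A minimal sketch of that time-based fallback (the threshold, the helper functions, and the object set are illustrative stand-ins):

```python
import time

SIGHT_TO_STOP_S = 2.5   # illustrative: tuned so the robot closes the gap to ~6 inches

first_seen = None
while True:
    label = classify_current_frame()      # hypothetical: e.g. the Keras loop above
    if label in IR_BLIND_OBJECTS:         # hypothetical set: short or all-black objects
        if first_seen is None:
            first_seen = time.monotonic() # start the sighting timer
        elif time.monotonic() - first_seen > SIGHT_TO_STOP_S:
            stop_and_turn(label)          # hypothetical: 90-degree turn for this object
            first_seen = None
    else:
        first_seen = None                 # lost sight of the object; reset the timer
```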
Final Code
Image of iRobot Create 3 with phone mount and phone

Create 3
Robot Controller

March 2024
Programmed a Create 3 robot to be controlled via Airtable. Remotely navigated the robot through a maze, connecting through a Zoom call for visuals.

Project Overview

Using a Create 3 robot from iRobot, we programmed it to be controllable using an Airtable. With this, we could control the robot's movements from anywhere, so long as we had access to the Airtable. For this project, we tested our robot by driving it through a maze, controlling it from a different room and visualizing the robot's position with a phone on a video call, simply mounted on the robot.

Set Up

An iRobot Create 3 was used to give us a baseline of reliable motion and for the extensive sensors built into the robot that we could tap into. The Create 3 is entirely based on ROS 2, so we needed to develop a program that creates a node that takes values from the Airtable and publishes them to a topic to tell the robot to move or turn. We also needed to develop a mount to prop up a phone at an ideal angle to see well ahead of the robot, since we could not practice on or see the map before the demo.
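A minimal sketch of such a node (the Airtable base, field names, and polling rate are illustrative; /cmd_vel with a geometry_msgs Twist is the standard Create 3 velocity interface):

```python
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist
import requests

API = "https://api.airtable.com/v0/BASE_ID/Drive"   # illustrative base/table
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}

class AirtableDriver(Node):
    def __init__(self):
        super().__init__("airtable_driver")
        self.pub = self.create_publisher(Twist, "/cmd_vel", 10)
        self.create_timer(0.5, self.poll)   # re-read the Airtable twice a second

    def poll(self):
        fields = requests.get(API, headers=HEADERS).json()["records"][0]["fields"]
        msg = Twist()
        msg.linear.x = float(fields.get("linear", 0.0))    # m/s
        msg.angular.z = float(fields.get("angular", 0.0))  # rad/s
        self.pub.publish(msg)

def main():
    rclpy.init()
    rclpy.spin(AirtableDriver())

if __name__ == "__main__":
    main()
```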

Approach

This project was mostly a programming challenge to get some good applied experience with ROS 2 for the first time. iRobot has a lot of documentation for the Create 3 since it is intended for educational use, but since we had no experience with ROS 2, and documentation on how ROS works is not too extensive, this was a bit of a challenge.
Since the robot itself was already built, we just needed to quickly put together a phone mount, which we ended up laser cutting. An assembly of the mount is shown. We also used small suction cups that stick directly onto the screen for good stability while the robot is in motion.
SolidWorks screenshot of phone mount

Results

We ended up with two rows in our Airtable to plug in linear and angular velocities. This gives the driver a bit more control, changing how the robot moves forward separately from how it turns, and letting it rotate in place if need be.
Since the Airtable only controls velocity in the linear and angular degrees of freedom, and not distance or angular position, the robot is hard to control in general. We also noticed a pretty significant lag when changing the linear and angular velocities, which seemed unavoidable. Additionally, we initially had issues getting our robot to turn to the right, so we ran through the demo turning all the way around to turn right; this was later resolved after an update was pushed for the Create 3 that let us input negative values.
Final Code
Final built camera line follower robot

Line Follower:
Camera

February 2024
Upgraded the line follower robot to use a camera instead of a color sensor to follow the contours of the tape, using PID control to correct for error continuously.

Project Overview

Building off of the color sensor line follower, we implemented a camera to track the line, knowing that the color sensor was rather unreliable. We implemented Proportional-Integral-Derivative (PID) control to dictate the robot's movement based on where the camera sees the track. Since we ran into far fewer issues with the camera, we were also able to allocate time to decorate our robot as Lakitu, the referee in Mario Kart.

Set Up

An image of the tracks that our robot was challenged to follow is shown here. The black line to the far right was the biggest challenge, with all of its sharp right-angle turns.
We were given two cameras that we could use, but we had issues with one of them, so we just used one. We kept the same partners from the color sensor line follower so that we would not have to rebuild entirely if we did not want to.
Camera line follower track

Process

Due to some mechanical issues with the last iteration, we decided to do an essentially full redesign. A SolidWorks assembly of this new design is shown here.
We added a more secure motor mount so that the motors themselves had zero degrees of freedom, added a little stand for our Raspberry Pi to sit on, and made a piece that sticks up and out to mount the camera and position it for the best view.
Upon trying the camera and line detection for the first time, we noticed a drastic difference in reliability between the camera and the color sensors. Getting the robot to identify the line accurately was rather straightforward, so we could put more of our focus onto implementing PID (a sketch of this loop follows).
SolidWorks screenshot of camera line follower robot assembly with one side wall missing to show internal components
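The core of the camera-plus-PID approach can be sketched as follows; the HSV thresholds, gains, and the motor helper are illustrative stand-ins, not our tuned values:

```python
import cv2
import numpy as np

KP, KI, KD = 0.6, 0.01, 0.15      # illustrative gains; would be tuned on the real robot
integral, prev_error = 0.0, 0.0

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 120, 80), (10, 255, 255))   # red tape, illustrative range
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        continue
    m = cv2.moments(max(contours, key=cv2.contourArea))     # largest contour = the line
    if m["m00"] == 0:
        continue
    error = m["m10"] / m["m00"] - frame.shape[1] / 2        # centroid offset from center
    integral += error
    steer = KP * error + KI * integral + KD * (error - prev_error)
    prev_error = error
    set_motor_speeds(base=0.2, turn=steer)   # hypothetical motor helper
```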

Results

We were able to get the robot to follow the red and purple lines rather well, pretty consistently, and with good speed.
We noticed that the camera and our image processing settings made it hard for the program to recognize the contours of the green and blue lines, so the robot was not able to follow those lines. We do think those lines would be feasible with just a little more time. We also were not able to do the track with the sharp turns, which we attribute to our robot as a whole being pretty clunky and the wheels being so far from the camera, but we did have ideas for how to get around this given more time.
Ultimately, I was really proud of how this robot turned out. I think it performed rather well.
Final Code
Color sensor line follower robot

Line Follower:
Color Sensor

February 2024
A robot that uses two color sensors to track the white paper surface bordering a colored tape track, using PID control to correct for error in the robot's motion.

Project Overview

In my Intro Robotics & Mechanics class, we were paired into teams to build a line-following robot using a color sensor to track the line. We were also tasked with implementing Proportional-Integral-Derivative (PID) control to minimize error using data from the color sensor. Our robot needed to be able to follow one colored line of our choosing on a predetermined set of tape tracks.

Set Up

An image of the tracks that our robot could follow is shown here. We could pick whichever one we wanted the robot to follow, and for an extra challenge, we could have it follow multiple lines or switch between colors in a controlled way.
We were also given two color sensors to track the line. These color sensors ended up being pretty unreliable to work with, and though we were given some basic code to make them work, further work was needed to make them read the colors correctly.
Tracks for the line follower made of large white paper and lines made from different color masking tapes.

Approach

Understanding how the color sensor worked, we decided early on that we would benefit from some sort of black box around the color sensor. With this we could use the LEDs in the color sensor to provide consistent lighting.
We decided on the general layout shown, with two color sensors bordering the edges of the tape, tracking the white paper and turning accordingly when one reads the tape color.
After testing the color sensor with a laser-cut box, we still noticed some significant inconsistencies in the values from the color sensor. We decided to develop a function in our program that takes in a large number of values from the color sensor, eliminates the extremes, and outputs an average. This allowed us to identify the different colors pretty accurately and consistently after determining the color values of each tape. A sketch of this filtering idea follows below.
Initial sketches of how we planned on arranging components and how we wanted the robot to be driven
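A minimal sketch of that filtering function (the sample counts and the read_rgb helper are illustrative):

```python
def filtered_reading(read_rgb, samples=20, trim=4):
    """Average many raw color-sensor readings per channel after discarding
    the extremes, smoothing out the sensor's inconsistency.
    read_rgb is a hypothetical function returning one (r, g, b) reading."""
    readings = [read_rgb() for _ in range(samples)]
    averaged = []
    for ch in range(3):                                   # r, g, b independently
        vals = sorted(r[ch] for r in readings)[trim:-trim]
        averaged.append(sum(vals) / len(vals))
    return tuple(averaged)
```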

Process

We laser cut the boxes that hold the color sensors from black acrylic, and a piece from clear acrylic to hold the color sensors at a fixed distance so that we could line up the tape with the gap at the start. We 3D printed and laser cut the wheels to fit onto the motor shafts, and laser cut motor holders and a base to house all the components. The SolidWorks assembly is shown. We also developed a PID controller to have our robot turn proportionally to how much error it detects.
We were continually fine-tuning the color values throughout the process and eventually had to settle for something that was just good enough. We also had some issues with our motors stalling when going slowly; to compensate, we manipulated the overall motion of the robot to allow it to make mistakes, backing up a bit with each step.
SolidWorks assembly of the color sensor line follower

Results

We were able to get our robot to follow the blue line pretty effectively and with pretty good speed. During the final in-class testing, it performed pretty well, and a video of it is shown. We had it programmed to theoretically follow any line, but we would need to fine-tune its known color values first.
If we had more time to work on this, we would definitely aim to get it to follow one of the other colored lines, and we don't imagine it would take as much time now. We also would have liked to decorate it better had we had more time, since right now it is just a wooden box with wheels and electronics stuffed inside.
Final Code
A gear train driving an arm that lifts a laser cut Mario character

Double Jump:
Gear Design

January 2024
For this project, we developed a gear train to reduce a full 360-degree rotation to just 90 degrees, without using premade gears or software to create them for us.

Project Overview

In my Intro Robotics & Mechanics class, we were paired into teams and needed to develop a gear design where running a stepper motor for 200 steps would move a character from 90 to 180 degrees. This design needed to use at least 4 gears, and each team member had to design at least 1 gear. We declared a successful design to be within +/- 50 degrees and an exemplary success to be within +/- 5 degrees.

Approach

Since my partner and I each needed to design at least one gear, we decided to do two 1:2 gear ratios with a compound gear to get to a final 1:4 ratio. With this, we would each design two gears that only needed to mesh with each other, so we would not have to go back and forth comparing each other's gears to make sure they meshed correctly. To be safe and avoid interference, we decided to each do one 16-tooth and one 32-tooth gear. The arithmetic behind this plan is sketched below.
We used a gear generator to quickly develop our plan and Onshape to model the gears themselves.
For fabrication, we decided it would be best to laser cut all of the gears and mount them on dowels supported by a laser-cut stand. We also decided it would be best to develop an arm to move the character.
Initial gear design from geargenerator.com
Screenshots from Onshape of my gear designs
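The arithmetic checks out as follows, assuming the standard 200-step-per-revolution stepper (which matches the 200-step requirement):

```python
STEPS_PER_REV = 200                       # standard 1.8-degree stepper (assumption)
input_deg = 200 / STEPS_PER_REV * 360     # 200 steps -> one full 360-degree turn

stage_ratio = 16 / 32                     # each 16T:32T mesh halves the rotation
total_ratio = stage_ratio * stage_ratio   # two stages via the compound gear -> 1:4

print(input_deg * total_ratio)            # 90.0 -> sweeps the character from 90 to 180 degrees
```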

Results

Fabrication and assembly were rather straightforward, and our gears seemed to mesh pretty well right off the bat. We quickly laser cut and etched a Mario character and fastened it with a loose nut and bolt so that it would stay upright. During fabrication we also decided to add a calibration feature to ensure that our character was aligned at 90 degrees at the start. We did this by wedging a limit switch into our stand at the right position underneath the rotating arm. With this addition we did need to include some spacers, but it worked out rather well. Since we only used 3mm acrylic for our gears, we had some concerns that they would misalign and fail to mesh, but the design performed well during our tests and the demo regardless.
Final Code
The KEN robot car with three large monster truck wheels with added rubber bands for traction, a Raspberry Pi and protoboard with wires sticking out on top of the bed, and a guide beam toward the back of the car.

KEN:
An Intrepid Robot

December 2023
KEN, named after its makers, Kimberly, Ed, and Nezy, is a robot designed for loaded autonomous straight ramp ascension.

Project Overview

In my Electronics and Controls 1 class, my team and I were assigned a two-part challenge for our final project. For part 1, the goal was to build a robot to simply travel up a given ramp with a controller. Part 2 was to modify our robot to autonomously roll a large cardboard tube up the ramp with the help of another team's robot, without letting the tube fall off the ramp.
Demo Video

Set Up

The ramps that our robot climbed and the setup with the tube are described in the figures shown. They were built out of birch wood and were available to us throughout the building process for testing.

For part 2, grip tape was added to the surface of the ramps to aid traction. There were two ramps available, so teams could test together and collaborate while fabricating. We were also provided the tube that would be used in the actual presentation to test with.
Schematic of ramp
Sketch of ramp set up

Goals

Part 1

  • Should fit in a circle 45 cm in diameter
  • Should be able to ascend the ramp without falling off the side
  • No one should touch the robot during its adventures. Should be remote controlled.
  • Should be controlled through wifi from a laptop or phone.
  • Cannot fly. (We don’t have the space to test drones safely, unfortunately.)
  • Does NOT need to turn in arbitrary directions. Should be optimized for straight ramp ascension.

Part 2

  • Should still comply with all the constraints from part 1
  • Should receive only one signal from a human: the click of a button to begin operation.
  • Should respond to two URLs: /start/<delay> and /target/<speed>
  • Should only accept requests where delay is an integer in the range 1-10 seconds, and speed is an integer in the range 1-1000 mm/second (a sketch of these routes follows this list)
  • It is the shared responsibility of both robots to control their speed and the tube angle to shepherd the tube to the top of the ramp.
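To make the URL interface concrete, here is a minimal sketch of how those two routes could be served on the Pi; the use of Flask and the handler internals are an illustration, not necessarily our implementation:

```python
from flask import Flask

app = Flask(__name__)
state = {"delay_s": 0, "target_mm_s": 0}   # read by the drive loop

@app.route("/start/<int:delay>")
def start(delay):
    if not 1 <= delay <= 10:
        return "delay out of range", 400
    state["delay_s"] = delay               # wait this long, then begin driving
    return "ok"

@app.route("/target/<int:speed>")
def target(speed):
    if not 1 <= speed <= 1000:
        return "speed out of range", 400
    state["target_mm_s"] = speed           # setpoint in mm/s
    return "ok"

app.run(host="0.0.0.0", port=5000)         # reachable over wifi from a laptop or phone
```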

Electronics

In order to control the robot over wifi, a Raspberry Pi 4 was used. For simplicity, one 12 V motor was used to drive the front wheels, and a MOSFET was used to control the motor speed using PWM from the Raspberry Pi. Additionally, for part 2, the robot needed a way to sense whether or not it had hold of the tube, for which we used 2 buttons. For our final robot, we used a protoboard for more secure connections.
Circuit diagram of robot with a MOSFET to control a 12V motor and two buttons with pull down resistors.

Ideation - Part 1

Initial sketches are shown here. In order to design for straight ramp ascension, and understanding how finicky fine-tuning the provided motors is, we decided to arrange the two front wheels on a shaft driven by the motor through a large gear ratio. This meant that any difference in power between the wheels would only come down to traction. It also meant that our robot would be plenty strong with the large gear ratio, and we would not have to worry about stalling. For balance we decided on one free-spinning wheel in the back. We also decided to use rather large and wide wheels, aiming for the most surface contact and the best traction.
Initial sketches of robot design and organization

Fabrication - Part 1

The wheels of the robot were 3D printed and laser cut. Using SolidWorks, I developed a tire of sorts to be 3D printed with ridges on the outside that we could fill with hot glue for traction. On the inside of the tires are cutouts that securely hold in place a makeshift rim laser cut from 2 layers of 6mm acrylic for strength. Both models are shown in the figures here.
Wheel assembly
In order to firmly drive the robot, the bull gear and both wheels needed to be secured onto the shaft very firmly. To achieve this, and through much trial and error, the two front wheels, the bull gear, and some ball bearings were fed onto a threaded rod and secured in place using nuts, washers, spacers, and some hot glue for good measure.
The motor was secured in place with hot glue on a 3D printed motor mount that I modeled. This whole assembly is shown in the figure here. The bed that holds these components together was laser cut.
Detail shot of robot SolidWorks assembly
For the free spinning back wheel, a similar, but simpler set up was applied, using a threaded rod, ball bearings, washers, and nuts to hold the wheel in place on the bed.
To house all of the components and to keep wires from getting eaten by any spinning parts, we designed a shelf to store the batteries and the motor while leaving the Raspberry Pi and all the wiring accessible for debugging and separate from any spinning parts.
The controller interface we developed to control the robot for the initial demo is shown here.
Robot controller interface with a wall-e theme

Part 2

The first thing we did was calibrate our robot so we could clearly dictate its speed in mm/s. Our process for this is seen in the video here.
So that the robot stays on the ramp, we opted for a mechanical approach, building an attachment piece to act as guides that would push or pull the robot back if it got too far out of line.
To sense whether the robot had hold of the tube or not, we used two buttons positioned at the far edges of the front of the robot. The weight of the tube would depress both buttons when the robot had a full hold of the tube. If only one button was depressed, this indicated that the tube had tipped to one side, meaning our robot was not matching the other robot's speed and needed to speed up or slow down.
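A minimal sketch of that two-button speed correction (the pin numbers, step size, which button maps to which correction, and the motor helper are all illustrative):

```python
import time
import RPi.GPIO as GPIO

LEFT_BTN, RIGHT_BTN = 17, 27     # illustrative BCM pins; pull-downs in the circuit
GPIO.setmode(GPIO.BCM)
GPIO.setup([LEFT_BTN, RIGHT_BTN], GPIO.IN)

speed_mm_s = 100                 # current commanded speed
STEP = 10                        # correction per check

while True:
    left, right = GPIO.input(LEFT_BTN), GPIO.input(RIGHT_BTN)
    if left and right:
        pass                     # full hold of the tube: keep pace
    elif left:
        speed_mm_s -= STEP       # tube tipping off one side: slow down
    elif right:
        speed_mm_s += STEP       # tipping the other way: speed up
    set_motor_speed(speed_mm_s)  # hypothetical PWM helper
    time.sleep(0.05)
```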
Since many of our components were made in SolidWorks, I developed an assembly to visualize the robot and make rapid changes to the files before laser cutting and gluing the real thing. This aided in developing the buttons, visualizing how the tube would interact with them, and making sure the wheels would make it onto the ramp with the additional overhang.
SolidWorks model of the whole robot assembly

Results

For the railing guides, we were not initially too sure how well they would work, so we worked more intuitively and made rapid, scrappy changes with a dowel, scrap wood, PVC, hot glue, and tape for a smoother surface. For the buttons, we found out early on that the buttons we had easy access to were not sensitive enough to be depressed by the weight of the tube alone, so we ended up developing a sort of button lever using small springs and copper tape that worked very well.
For our final presentation, we ended up without a means to communicate with other robots and instead adjusted the code to change the robot's speed based solely on the button inputs. This worked well enough, given there were not many other robots able to do this either.

In the end, our robot was able to push the tube up the ramp with the help of another robot. With the short turnaround, we had to prioritize making sure our robot could actually complete the task. If we had more time, we would definitely have dedicated it to the communication aspect; we suspect we would have needed just one more day.
Additionally, we noticed the wheels slipping quite a bit during the presentation when the robot got too far ahead. This worked fine for the presentation, since it gave the other robot time to catch up, but it was not how we programmed it to work. We figure additional weight over the wheels would give it better traction.
Final Code
Used KiCAD to design a PCB to house an H-bridge motor controller, manufactured through OSH Park. Features labels for convenience, LEDs to indicate power connection and direction, and a silkscreen image on the back for fun.
Back of PCB
PCB Front
A slim black box with a large spiraled wire coming out of the top, a purple start button, a wand, and LEDs around the border.

Buzz Wire:
A Lame Game

Fall 2023
Buzz Wire is a carnival-style game where the objective is for the player to feed the spinning spiraled wire through the loop of the wand without touching it.

Project Overview

In my Electronics and Controls class, I was tasked with using the basic electronic components we had learned about at that point to make a very simple game that is "at least mildly entertaining." As this was early in the semester and we did not have too many tools under our belts, there were few requirements:
-Use the DC gearmotor provided
-Require user interaction of some sort
-Fit inside a cube 20 cm on a side

I ended up taking this project a bit further than the initial requirements and remade it with a few upgrades, namely making it look nicer, implementing a start button and buzzer, and making the wand cordless.
Demo Video

Ideation

My initial idea was to make a carnival-style buzz wire game where the goal is to feed a wire through the loop of a wand without the two pieces coming into contact with one another. To meet the requirements of the project, I also used a motor to spin the wire, adding another factor of difficulty to the game. I wanted to keep this iteration pretty simple. The plan was to use an open circuit with one end being the actual spinning wire and the other end attached to the wand. This open circuit would also be connected to red LEDs wired in parallel to signal to the player that they had lost the game.

First Prototype

This was the result of my first prototype. I 3D printed an attachment that connected the bottom end of the wire to the shaft of the DC gearmotor. I also developed a custom box design that I laser cut; it included a hole for the main spinning wire to come out of, a hole for the wire connecting the wand, and holes to feed the legs of the LEDs through.

For better playability, I used a bit of heat-shrink tubing to create a bit of a safe zone at the very beginning of the wire, and I did the same for the base of the handle in the unlikely scenario of built-up heat or a wiring mishap. Additionally, since the motor I was using spun much faster than ideal, I wired it up with a MOSFET and programmed it using PWM to slow it down to a playable speed.
Initial prototype of the game with a larger wooden box base, a smaller spiral wire, an attached wand, and only 4 LEDs.

The Issues

This prototype did work, though not without its issues. The primary issue was that the game was still very much connected to my computer, and the motor just ran continuously; the only indicator that a player lost was the LEDs. Additionally, since the motor relied on PWM to slow it down so drastically, it would often need a bit of a helping hand to start spinning. Given the aim of the game and the current running through it, this was not ideal.

Final Product

For my final prototype, I decided to try a couple of things a little differently.
In order not to rely solely on the programming for the speed reduction, I also used a 1:4 ratio bevel gear that I 3D printed. This meant that the motor would not need a helping hand to start, and that the housing box could be much slimmer. I took this opportunity to laser cut the box out of matte black acrylic with spots for more LEDs, a start button, and a buzzer.
I also tried using the capacitive touch pins on the microcontroller I was using. This meant that the wire would not just be an open circuit; instead it would sense for an electric field, much like a phone screen. As long as the loop of the wand was conducting the electricity running through the player's hand, the game would operate as normal without the wand needing to be attached by a wire, and any kind of nudge on the wire from the player's hand or from the wand would trigger a game-over sequence (sketched below).
I added a start button that initiates the game, so that it is not constantly running and can sit idle so long as the microcontroller has power and the motor is connected to a 12V power source. I also added a buzzer for a bit more drama when a player loses, and so that the game lives up to its name.
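The writeup above doesn't name the microcontroller; assuming an ESP32 running MicroPython (whose touch pins read lower values on contact), the game-over sensing could be sketched as:

```python
from machine import Pin, TouchPad, PWM
import time

touch = TouchPad(Pin(4))          # capacitive pin wired to the spiral wire
buzzer = PWM(Pin(25), freq=440, duty=0)
leds = Pin(26, Pin.OUT)           # game-over LEDs (pins illustrative)
THRESHOLD = 300                   # readings drop below this on contact; tune empirically

def game_over():
    leds.on()
    buzzer.duty(512)              # sound the buzzer
    time.sleep(2)
    buzzer.duty(0)
    leds.off()

while True:
    if touch.read() < THRESHOLD:  # wand loop (via the player's hand) touched the wire
        game_over()
    time.sleep_ms(20)
```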

A slim black box with a large spiraled wire coming out of the top, a purple start button, a wand, and LEDs around the border.
Exploded view of CAD assembly of Proctor Silex Electric knife

Case Study and Teardown:
Electric Knife

Spring 2023
My team and I researched the Proctor Silex Electric Knife and performed a complete product teardown, case study, and CAD assembly.

Project Overview

Teardowns are a useful exercise in competitive benchmarking and in growing engineering knowledge. For this project, my team and I researched the Proctor Silex Electric Knife and performed a complete product teardown and CAD assembly.
This project was composed of two parts: the user and usability research, and the product teardown.
Demo Video

User Centered Design

This part of the project involved online research of the product, including manuals, reviews, and videos, and focused on the intended audience and the context in which this product would be used.
Electric knife design considerations

First Use

My team and I documented and reflected on our first time using the product. This is important for understanding a user's first impressions of a device. It includes the context of the packaging and the information it provides, as well as how intuitive the assembly was and how we felt using the device for the first time.
First use results and thoughts

Knowledge Elicitation

For this part of the project, my team and I observed a peer using the device for the first time. We gave them a specified task and the materials to complete it with minimal input. We also prepared questions to ask throughout the process and after the task.
Peer user testing results

Hierarchical Task Analysis

After our knowledge elicitation, we developed a hierarchical task analysis based on our user's decisions. This task analysis outlines the task we gave our user.
Peer user testing task analysis

Decomposition Table

While performing the teardown of the electric knife, my team and I documented the process and developed a table cataloging every part and its details.
Full Document
Electric knife decomposition table preview

Function Structure

After the teardown, we analyzed the parts and determined how they interact and, as a whole, how the product operates.
Electric knife subsystem function structure
Electric knife main function structure

CAD Assembly

My team and I were tasked with developing CAD models of each component of the product. We divided the work into three sections: the motor, the blade and its immediate components, and the exterior shell and its immediate components, which I was responsible for.
Exploded view of CAD assembly of Proctor Silex Electric knife
This was my first round of investment casting. I hand-sculpted the wax models with cold and hot techniques and cast them in bronze with a centrifugal casting machine.
Image of wax tree with rings on its rubber stand being weighed for casting calculations.
I used a hard blue wax that is meant to be shaped with hot and cold techniques, using saws, files, and a number of hot tools to get clean shapes. To prepare for casting, the wax pieces are mounted onto a rubber base with wax branches. A cylinder is then placed over the whole thing, and investment is poured in to create a mold.
Final rings cast in bronze and polished
The mold is then fired in a kiln for just enough time to burn the wax out. I then used a centrifugal casting machine to ensure that the molten metal completely filled the mold. Finally, I sawed the rings off the tree, filed the cut areas clean, and polished the surfaces.
Photo of Nezy

About me

My name is Marinez Jose, but you can call me Nezy. I'm a senior studying mechanical engineering at Tufts University. The core of my motivation stems from my sheer love of creating; for much of my life this fostered a passion for the fine arts, and it has since shifted gears toward engineering. It continues to drive my engineering work and, I feel, is apparent in many of my designs. Right now I'm very interested in working in robotics, medical devices, and consumer electronics.