All Projects

Prosthetic Foot Test Rig

April 2024
Developed a test rig for prosthetic feet that simulates the loading seen in a gait cycle, assessing how well prosthetic feet mimic the biomechanics of an anatomical ankle by measuring the moment generated about the ankle and the angle of deflection of the ankle throughout the gait cycle.

Project Overview

For my Senior capstone project, my team and I were matched with a client at Tufts Medical Center to develop a test rig that assesses prosthetic feet, measuring toward a quantifiable standard that our client aims to establish.
Index of Anthropomorphicity given Moment-Angle relationship (M. Pitkin)

Background

Current testing standards for prosthetic feet fail to consider how well they mimic the biomechanics of an anatomical human ankle during a natural walking cycle. To account for this, our client, Dr. Mark Pitkin, seeks to establish a measurable standard, the Index of Anthropomorphicity (IA). The IA quantifies the concavity or convexity of the moment-angle relationship in a prosthetic foot (moment about the ankle vs. angle of deflection). In order to establish this standard, a means to reliably and accurately measure moment-angle data in a prosthetic foot during a gait cycle is necessary.
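As a rough illustration of the idea (not Dr. Pitkin's published IA formula), the concavity or convexity of measured moment-angle data can be checked with something as simple as a quadratic fit, with the sign of the quadratic term indicating curvature:

```python
import numpy as np

def concavity_indicator(angle_deg, moment_nm):
    # Fit M(theta) ~ a*theta^2 + b*theta + c; a < 0 suggests a concave
    # (anatomical-like) moment-angle curve, a > 0 a convex one.
    # Illustrative stand-in only -- not the published IA definition.
    a, _, _ = np.polyfit(angle_deg, moment_nm, deg=2)
    return a

# Hypothetical concave response for demonstration
theta = np.linspace(0.1, 15, 50)           # dorsiflexion angle, degrees
moment = 80 * np.sqrt(theta / 15)          # made-up moment data, N*m
print(concavity_indicator(theta, moment))  # negative -> concave
```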

User Needs / Engineering Requirements

Design

Key features:
Moment measured via current draw of Bending Actuator
Angle of deflection measured with rotary encoder
Gait cycle simulated with linear actuators and machined Compression Housing
Aluminum extrusion frame allows testing in frontal and sagittal planes for plantarflexion and dorsiflexion
User interface to guide operation
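A minimal sketch of how the moment and angle measurements above could be logged together (the read functions, calibration constant, and sampling rate are placeholders, not our actual acquisition code):

```python
import csv
import time

def read_encoder_angle():
    # Placeholder: replace with the rotary encoder read (degrees of deflection)
    return 0.0

def read_actuator_current():
    # Placeholder: replace with the bending actuator current draw read (amps)
    return 0.0

CURRENT_TO_MOMENT = 10.0  # assumed N*m per amp, set by calibration

with open("moment_angle.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["t_s", "angle_deg", "moment_Nm"])
    t0 = time.time()
    while time.time() - t0 < 2.0:                 # log a couple of simulated gait cycles
        angle = read_encoder_angle()
        moment = read_actuator_current() * CURRENT_TO_MOMENT
        writer.writerow([round(time.time() - t0, 3), angle, moment])
        time.sleep(0.01)                          # ~100 Hz sampling
```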

Moment Measurement

Angle Measurement

Body Weight Simulation

Preliminary Analysis / Current State

IA characterization of MF foot
Moment angle behavior in a normal ankle from (M. Pitkin)

These are some initial results from our test rig. We tested the Willowwood Meta Flow foot, which was designed to have more compliant behavior that mimics anatomical ankles. As can be seen in the figures, a normal anatomical ankle exhibits a concave curvature, similar to the MF foot.
*** These results are from BEFORE we swapped the load cell for a current sensor. Moment measurement is much higher than expected

Detailed results in IA characterization of MF foot

Second Semester Full Poster

Process

This project took place over the course of a whole year, and it was only made possible by my awesome teammates, Vivian Becker and Jack Goldberg, and our amazing advisor, Briana Bouchard. We clocked hours of work at the whiteboard, at the machine shop, and at the instrumentation lab. I can't show everything, but here are just some snapshots.
The team: Dr. Mark Pitkin, Jack Goldberg, Nezy Jose, Vivian Becker, Briana Bouchard
1st prototype - Proof of concept
Second prototype "finalized" design sketch
Angle measurement whiteboard session
Cut to size aluminum extrusion ready for assembly
The picture I took to document wiring before organizing
Frame assembly drawing
Encoder mount drawing

Robot Clam

April 2024
Developed a pneumatic robot clam that opens slowly and closes rapidly, mimicking the behavior clams use for locomotion.

Project Overview

For my Printable Robotics final, we were given the opportunity to make whatever we wanted. My team and I decided to explore alternative modes of underwater locomotion and take after clams, using pneumatic design to actuate the robot without electronics.

Process

Initial linear actuator explorations - COMSOL Simulation
Extended test with rubber bands
→ 4:1 opening to closing time
Hinge test with rubber bands
→ 2:1 opening to closing time
First shell design, with springs
→ 2:1 opening to closing time, complete closing in 0.07s

Results

Final Design
→ 2 springs held in place with 4 screws
→ Springs specced to match the spring constant (k) of the rubber bands from testing, retaining the opening-to-closing ratio (sizing sketched below)
Testing
→ Used balloon as a membrane to direct water
→ Tested underwater with air compressor at 170 kPa
→ Dye inside clam to visualize flow, rear expulsion
→ Future steps would use a water pump and a modified shell design to produce more lift
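The spring sizing came down to matching the effective stiffness of the rubber bands from the earlier tests; a back-of-the-envelope version of that calculation (with made-up numbers, not our measured values) looks like:

```python
# Match replacement springs to the rubber bands used in the hinge tests
rubber_band_force_N = 4.0       # assumed force at the test extension
rubber_band_stretch_m = 0.020   # assumed extension during that test
k_band = rubber_band_force_N / rubber_band_stretch_m   # equivalent stiffness, N/m

n_springs = 2                   # two springs share the load in parallel
k_per_spring = k_band / n_springs
print(f"target spring rate: {k_per_spring:.0f} N/m each")
```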

Pneumatic Bloom

April 2024
Developed a pneumatically actuated flower that opens and closes in response to light conditions and shadows using a pump and phototransistors.

Project Overview

For an art class called Thinking Machines and Automata, we were able to make whatever we wanted, implementing any concept introduced in the class. I decided to develop an interactive sculpture that would respond to a person's shadow. I iterated on the design of a classic 3D printed bending actuator to more closely resemble a flower petal, and molded laser cut acrylic.

Results

Bending actuator design
Design
Original pneumatic 3D printed bending actuator design
→ Mostly modified for looks and easier actuation
Demo
→ Petals connected to a common base with a single air inlet
→ Laser cut and heat formed clear acrylic
→ DC motor air pump hidden in base controls actuator
→ Phototransistors embedded in base control actuation, dark/shadow → bend
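The control logic is simple threshold switching on the phototransistor readings; a minimal sketch of the idea (assuming a Raspberry Pi with an MCP3008 ADC and a relay on the pump, which may not match the exact hardware used):

```python
from time import sleep
from gpiozero import MCP3008, DigitalOutputDevice  # assumed Pi + MCP3008 ADC setup

photo = MCP3008(channel=0)       # phototransistor divider into the ADC
pump = DigitalOutputDevice(17)   # relay/MOSFET driving the DC air pump

DARK_THRESHOLD = 0.3             # assumed normalized light level for "shadow"

while True:
    if photo.value < DARK_THRESHOLD:   # shadow over the sensor -> inflate, petals bend
        pump.on()
    else:                              # light returns -> pump off, petals relax
        pump.off()
    sleep(0.1)
```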

Pressure Assessment in Wrist Splints for CTS

February-May 2024
Developed a sensitive, non-invasive, and cost-effective means of capturing relative pressure forces on the interior of a wrist brace intended for Carpal Tunnel Syndrome (CTS) treatment, to assess the effectiveness of various splint designs.

Project Overview

In this study, we developed a cost-effective and non-invasive means to record the compression applied to the carpal tunnel by wrist braces intended for nocturnal splinting for Carpal Tunnel Syndrome, in order to consider how different rigid splint designs affect the relative pressure distribution around the wrist.

Background

Carpal tunnel syndrome (CTS) is characterized by compression of the median nerve in the carpal tunnel of the wrist, causing symptoms of pain and numbness in the hand and forearm.
The only real noninvasive treatment for CTS is wrist splinting; studies have shown that pressure on the median nerve is lowest when the wrist is held in a neutral position. That said, studies have also shown unclear effectiveness of wrist splinting as a treatment for CTS.
Knowing this, we considered the idea that wrapping a splint around the wrist leads to external compression of the carpal tunnel, negating the effects of holding the wrist neutral. We also considered the impact of different wrist splint designs on pressure on the carpal tunnel.

Approach

Splints:
  • Wrist splints use a rigid insert to prevent bending at the wrist
  • Designed 4 different geometries for our tests, maintaining similar stiffnesses by using thicker-gauge aluminum for narrower geometries
Data Recording:
  • We used small and slim Force Sensitive Resistors (FSRs) to record pressure data
  • FSRs were positioned as shown, installed with sewn tacks
  • Developed Arduino and LabView programs for data acquisition and live visualization
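Our actual data acquisition used Arduino and LabView; as an illustration of the logging side, a Python equivalent might look like the following (assuming the Arduino streams one comma-separated reading per sensor per line):

```python
import csv
import serial  # pyserial

port = serial.Serial("/dev/ttyACM0", 9600, timeout=1)  # port and baud rate assumed

with open("fsr_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["fsr1", "fsr2", "fsr3", "fsr4"])
    while True:
        line = port.readline().decode(errors="ignore").strip()
        if not line:
            continue
        values = line.split(",")
        if len(values) == 4:        # one value per FSR position
            writer.writerow(values)
            f.flush()
```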

Results

We tested our designs with the wrist held still and in motion to replicate realistic wrist splinting. The graphs below show results from one trial with the control splint, held still and in motion. The average value of each FSR is also plotted for comparison. Knowing that sensor 4 was positioned directly over the carpal tunnel, we concluded that the optimal design was the one with no bend below the wrist.

Printing Press

April 2024
Modeled a fully functional etching printing press in Solidworks, including an animated assembly.

Project Overview

For my Computer Aided Product Design final project I was tasked to make a model and assembly of anything I wanted in Solidworks.

Deliverables

→ Assembly with a dynamic element
→ Exploded view
→ Assembly animation
→ Part drawings for 1-2 components
→ Assembly drawing with the exploded view

I decided to model a printing press, focusing on realistic manufacturability. I wanted a project that would be realistic for me to follow. After some research I found this blog that documented a hobby build: https://www.instructables.com/Build-a-Printmaking-Press/
I followed the general approach, but it proved not to be super detailed, which led to me having to make a lot of design choices. That was not the initial scope of the project, but it was fun regardless.

Robot Cafe

April 2024
Developed a fully robotic cafe that was able to take orders, make lattes, and deliver the latte to the customer.

Project Overview

In my Intro to Robotics class, the final project tasked us as a class to develop a fully automatic robot cafe implementing topics from the course.

Approach

Since this was a self-led, class-wide project of about 25 people, it was important to us that we delegated tasks and components of this project appropriately. We ended up dividing ourselves into 4 general teams that varied in size by intensity/demand: Coffee Team, Milk Team, Art Team, and Transport Team.

Overview

The system works as follows:
An Airtable to track progress
Customer orders a latte at the Ordering Station, picking a size and latte art stencil; the order info is logged in the Airtable
At the Cup Station, a cup is dispensed onto the platform attached to a Create 3
Create 3 rotates, bringing the empty cup into the Coffee Station, where the lid opens, a pod is dispensed, the lid closes, and the espresso machine dispenses espresso into the cup
Create 3 rotates to the Milk Station, where the milk frother, which has been frothing since the order was placed, pours its contents into the cup
Create 3 rotates to the Art Station, where the selected stencil is positioned over the cup and a solenoid taps a dusting of cinnamon onto the frothed milk
Create 3 rotates the finished latte into the Customer Pick-Up Station, where an ultrasonic sensor awaits the customer
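The Airtable acts as the shared state between stations; a rough sketch of how the Ordering Station might create a record for the other stations to poll (the base ID, field names, and statuses here are placeholders, not our actual schema):

```python
import requests

AIRTABLE_URL = "https://api.airtable.com/v0/BASE_ID/Orders"   # placeholder base/table
HEADERS = {"Authorization": "Bearer YOUR_API_KEY",
           "Content-Type": "application/json"}

def log_order(size, stencil):
    # Create a new order record; downstream stations watch the Status field
    record = {"fields": {"Size": size, "Stencil": stencil, "Status": "ordered"}}
    resp = requests.post(AIRTABLE_URL, headers=HEADERS, json=record)
    resp.raise_for_status()
    return resp.json()["id"]

order_id = log_order("medium", "mushroom")   # hypothetical order
```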

Coffee Subsystem Process

Given an existing espresso machine, my team was tasked with incorporating it into the system and automating it.
Key Points:
Subsystem Airtable to communicate across raspberry pis
Linear actuator to open and close lid
Linkage system and 3D printed channel to dispense pods into position
Relays to control buttons and trigger actions with the Airtable
Individual contribution
Linear actuator to open and close the lid, which needed a relatively large force
Developed a custom clevis from two pieces of aluminum sheet metal that wraps around the lid and is held in place by a notch in the lid
Aluminum extrusion to mount linear actuator
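A simplified sketch of how the relays and the lid actuator could be sequenced from a Raspberry Pi (pins, travel times, and the overall sequence are placeholders, not our final code):

```python
from time import sleep
from gpiozero import DigitalOutputDevice  # relay channels on Pi GPIO

extend_relay = DigitalOutputDevice(23)    # drives the lid actuator out (lid opens)
retract_relay = DigitalOutputDevice(24)   # drives the lid actuator back in (lid closes)
brew_relay = DigitalOutputDevice(25)      # wired across the espresso machine's brew button

def open_lid(travel_s=3):
    extend_relay.on(); sleep(travel_s); extend_relay.off()

def close_lid(travel_s=3):
    retract_relay.on(); sleep(travel_s); retract_relay.off()

def press_brew():
    brew_relay.on(); sleep(0.5); brew_relay.off()   # momentary button press

# One hypothetical coffee-station cycle, triggered when the Airtable status updates:
open_lid()     # pod drops into position while the lid is up
close_lid()
press_brew()
```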

Results and Areas of Improvement

We intended to make 3 lattes in sequence for the demo, but after the first successful cup, the pods in the espresso machine jammed, leading the hinge of the lid to fail under the force of the linear actuator. This was the unfortunate end of the Robot Cafe, since there was no easy fix within the time frame. I do think there were a couple of things we could have done to prevent this. In development, my team and I debated implementing limit switches to ensure complete opening and closing, but given the time frame and other more prominent issues, this ended up being a lower priority than it should have been. That said, I am still very proud of the results of this project. The team coordination was impressive given the scope, and I'm proud of my team and how it all came together.

Object Based Maze Runner

March 2024
Programmed an iRobot Create 3 to recognize a set of objects in order to navigate a maze autonomously using Google's Teachable Machine and a PiCamera.

Project Overview

Using a Create 3 robot from iRobot, we developed a program that would control the robot to recognize a number of objects in a maze and make a specific turn based on the object recognized. Specifically, the robot was supposed to make a 90 degree turn 6 inches away from each object. We would not know the orientation of each object, and we did not learn which direction each object indicated until 10 minutes before testing.

Set Up

For the object recognition aspect of this project, we were tasked to use Google's Teachable Machine to develop a model that recognized a total of 7 different objects. We used a PiCamera, which we would need to develop a mount for. In addition, since the robot needed to stop 6 inches from the object, some kind of distance sensor needed to be implemented. Ultimately this left us choosing between an ultrasonic sensor and the built-in IR sensor on the robot.

Approach

The first course of action was to build a mount for the camera. For this, we developed a laser cut piece to secure the Raspberry Pi and hold the PiCamera in a fixed position.
From there, we collected data of the different objects with the PiCam to train the model and exported the TensorFlow Keras model to work with through OpenCV.
Programming the actual motion in ROS involved spinning with a RotateAngle action and driving with a Command Velocity topic. For simplicity, we also decided to use the IR sensors built into the robot to detect distance.
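A rough sketch of the recognition step (the filenames and preprocessing follow the Teachable Machine Keras export conventions; the object-to-turn mapping is a placeholder, and the actual drive commands through RotateAngle and the velocity topic are omitted):

```python
import cv2
import numpy as np
from tensorflow.keras.models import load_model

model = load_model("keras_model.h5")                       # Teachable Machine export (name assumed)
labels = [line.strip().split(maxsplit=1)[-1]               # drop the leading class index if present
          for line in open("labels.txt")]

TURN_FOR = {"bear": "left", "mario": "right"}              # placeholder mapping, set on test day

cap = cv2.VideoCapture(0)                                  # PiCamera frames grabbed via OpenCV
ok, frame = cap.read()
if ok:
    img = cv2.resize(frame, (224, 224)).astype(np.float32)
    img = img / 127.5 - 1.0                                # Teachable Machine normalization
    probs = model.predict(img[np.newaxis, ...])[0]
    label = labels[int(np.argmax(probs))]
    print("saw", label, "-> turn", TURN_FOR.get(label, "unknown"))
cap.release()
```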

Results

We were able to get our robot to respond pretty well to each object. We did run into model- and object-specific issues with the bear and the small Mario figure, which appeared to be a common issue across other teams.
We also ran into a bit of a roadblock when we realized that the positioning of the IR sensors on the robot meant that, at 6 inches, some objects were too short to be caught by the sensor. We also realized that, since it is an IR sensor, it simply could not detect all-black objects: the black color absorbed all of the IR and did not let any of it bounce back to the sensor. To work around this, we hard-coded into the program that if the model recognized any of these objects, it would track how long it had been recognizing the object and, at a certain point corresponding to about 6 inches, stop and turn (a rough sketch of this timing fallback is below). This ended up working pretty well, and we considered just using it for all of the objects, but we stuck with the IR sensors since we were proud of getting them to work.
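A rough sketch of that timing fallback (speeds and distances here are assumed numbers, not our calibrated values):

```python
import time

DRIVE_SPEED_MM_S = 100.0     # assumed forward speed commanded to the robot
STOP_DISTANCE_MM = 152.4     # 6 inches
START_DISTANCE_MM = 500.0    # assumed distance when the object is first recognized

travel_time_s = (START_DISTANCE_MM - STOP_DISTANCE_MM) / DRIVE_SPEED_MM_S

first_seen = time.monotonic()        # stamped when an IR-invisible object is first recognized
# ...keep driving and classifying frames...
if time.monotonic() - first_seen >= travel_time_s:
    print("roughly 6 inches away -> stop and make the 90 degree turn")
```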
Final Code
Final built camera line follower robot

Line Follower:
Camera

February 2024
Upgraded the line follower robot to use a camera instead of a color sensor to follow the contours of the tape, using PID control to continuously correct for error.

Project Overview

Building off the color sensor line follower, we implemented a camera to track the line better, knowing that the color sensor was rather unreliable. We also decided to decorate our robot as Lakitu, the referee from Mario Kart, for the class theme, Mario Bros.

Set Up

An image of the tracks that our robot was challenged to follow is shown here.

Camera line follower track

Process

A SolidWorks assembly of our design is shown here, fabricated with laser cutting and 3D printed wheels.
Upon trying the camera and line detection for the first time, we noticed a drastic difference between the camera's and the color sensor's reliability. Getting the robot to identify the line accurately was rather straightforward, so we could put more of our focus onto implementing PID.
Solidworks screenshot of Camera Line follower robot assembly with one side wall missing to show internal components
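The core of the PID implementation is turning the line's horizontal offset in the image into a steering correction; a simplified sketch (gains, color thresholds, and the region of interest are illustrative, not our tuned values):

```python
import cv2
import numpy as np

KP, KI, KD = 0.8, 0.0, 0.15      # illustrative gains
integral, prev_error = 0.0, 0.0

def steering_correction(frame):
    # Mask the tape color, find its centroid near the bottom of the frame,
    # and PID on the normalized horizontal offset from the image center.
    global integral, prev_error
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (140, 80, 80), (175, 255, 255))   # rough purple band
    mask = mask[int(mask.shape[0] * 0.6):, :]                  # look just ahead of the robot
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return 0.0                                             # no line found, drive straight
    cx = m["m10"] / m["m00"]
    error = (cx - mask.shape[1] / 2) / (mask.shape[1] / 2)     # -1 (far left) .. 1 (far right)
    integral += error
    derivative = error - prev_error
    prev_error = error
    return KP * error + KI * integral + KD * derivative

# Wheel commands would then be base_speed -/+ the returned correction.
```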

Results

We were able to get the robot to follow the red and purple lines rather well and pretty consistently with good speed.
The camera and our current image processing settings made it hard for the program to recognize the contours of the green and blue lines, so it was not able to follow those lines. We do think that those lines could be feasible with just a little more time. We also were not able to do the track with the sharp turns, and we attribute this to our robot as a whole being pretty clunky and the wheels being so far from the camera, but we did have ideas for how to get around this given more time.
Final Code
This was my first round of investment casting. I hand sculpted the waxes with cold and hot techniques and cast them in bronze with a centrifugal casting machine.
Image of wax tree with rings on its rubber stand being weighed for casting calculations.
→ Hard blue wax, shaped with hot and cold techniques (saws, files, hot tools)
→ Wax pieces are mounted onto a rubber base in a tree with wax branches
→ Poured investment over wax to create a mold and fired to burn out wax
Final rings cast in bronze and polished
→ Cast in bronze with centrifugal casting machine, cleaned up and polished by hand

*** Not pictured, but after polishing I was also able to electroplate the rings with a basic solution, some sterling silver wire, and a 3.3 V pin from an ESP32
The KEN robot car with three large monster truck wheels with added rubber bands for traction, a Raspberry Pi and protoboard with wires sticking out on top of the bed, and a guiding beam towards the back of the car.

KEN:
An Intrepid Robot

December 2023
KEN, named after its makers, Kimberly, Ed, and Nezy, is a robot designed for partnered autonomous straight ramp ascension with a payload.

Project Overview

In my Electronics and Controls 1 class, for our final project, we were assigned a 2 part challenge. For part 1, the goal was to build a robot to simply travel up a given ramp with a controller. Part 2 of this project was to modify our robot to autonomously push a large cardboard tube up the ramp with the help of another team's robot without letting the tube fall off the ramp.
Demo Video

Set Up

The ramps that our robot climbed and the set up with the tube are described in the figures shown. They were built out of birch wood and were available to us throughout the building process for testing.

For part 2, grip tape was added to the surface of the ramps to aid in traction. There were two ramps available to test with other teams and collaborate while fabricating. We were also provided the tube that we would be using for the actual presentation to test with.
Schematic of ramp
Sketch of ramp set up

Part 1

→ Pretty simple electronics using Raspberry Pi and just a MOSFET for one direction DC motor control
→ Robot only needs to drive straight up a ramp and later needs to be autonomous
→ Simplified design, 2 wheels driven by one motor with a 4:1 gear ratio, 3rd free spinning back wheel
→ Wide wheels with rubber bands for better traction
→ Wheels on a threaded rod, held in place with nuts and bearings
Initial sketches of robot design and organization
Initial planning
Wheel assembly
Wheel assembly
Detail shot of robot solidworks assembly
Drive train diagram
Robot controller interface with a wall-e theme
HTML controller interface

Part 2

→ Calibrated robot speed from duty cycle to mm/s for robot communication - video shown
→ Mechanical approach to autonomously stay on the ramp: railing guards that hang down the ramp - PVC with wood filled with nails for weight and tape for less friction
→ 2 buttons at the far edges of the front of the robot to sense tube position and control speed
→ Developed custom buttons with acrylic, springs, and copper tape to depress under the weight of the tube
→ This allowed the robot to perform the task without robot communication if time did not allow
→ Developed an assembly for rapid changes and simulation before building
Solidworks model of the whole robot assembly

Results

→ Railing guards worked much better than expected
→ We did not have enough time to finalize robot communication, but given one more day, we think we could have gotten it
→ Buttons worked great, sensitive enough for the tube
→ Wheels slipped during the demo under the added weight of the tube when the partner robot stalled; we could have made the robot itself heavier to help traction
→ *** Did not know about PID at the time, so we took a bang-bang control approach: if one button is depressed, speed up or slow down depending on which ramp and which button; if no button is depressed, run at a higher speed; if both are depressed, run at the base speed agreed with the partner robot (sketched below) - for the purposes of this project, it worked pretty well
Final Code
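A simplified sketch of that bang-bang logic (pins, duty cycles, and the exact button-to-speed mapping are placeholders; the real mapping depended on which ramp we were on and what speed we agreed on with the partner robot):

```python
from time import sleep
from gpiozero import Button, PWMOutputDevice

left_btn = Button(5)            # custom copper-tape buttons at the front corners
right_btn = Button(6)
motor = PWMOutputDevice(18)     # MOSFET gate PWM for the single drive motor

BASE, FAST, SLOW = 0.45, 0.60, 0.30   # assumed duty cycles
ON_LEFT_RAMP = True                    # set before the run

while True:
    left, right = left_btn.is_pressed, right_btn.is_pressed
    if left and right:
        motor.value = BASE             # tube square against us: hold the agreed speed
    elif not left and not right:
        motor.value = FAST             # lost contact: catch back up to the tube
    else:
        leading = left if ON_LEFT_RAMP else right
        motor.value = SLOW if leading else FAST   # back off or push, depending on which end we carry
    sleep(0.05)
```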
Used KiCAD to design a PCB housing an H-bridge motor controller, manufactured through Oshpark
→ Labels for easy soldering
→ LEDs to indicate power and direction
→ Silkscreen fish on the back for identification
Back of PCB
PCB Front
Exploded view of CAD assembly of Proctor Silex Electric knife

Case Study and Teardown:
Electric Knife

Spring 2023
My team and I researched the Proctor Silex Electric Knife and performed a complete product teardown, case study, and CAD assembly.

Project Overview

Teardowns are a useful exercise in competitive benchmarking and in growing engineering knowledge. For this project, my team and I researched the Proctor Silex Electric Knife and performed a complete product teardown and CAD assembly.
This project was composed of 2 parts: the user and usability research, and the product teardown.
Demo Video

User Centered Design

This part of the project involved online research of the product, including manuals, reviews, and videos, and focused on considering the intended audience and the context in which this product would be used.
Electric knife design considerations

First Use

My team and I documented and reflected on our first time using the product. This is important in understanding a user's first impressions of a device. This includes the context of the packaging and the information it provides, as well as how intuitive the assembly was and how we felt using the device for the first time.
First use results and thoughts

Knowledge Elicitation

For this part of the project my team and I performed an observation of a peer using the device for the first time. We gave them a specified task and materials to be able to complete the task with minimal input. We also prepared questions to ask throughout the process and after the task.
Peer user testing results

Hierarchical Task Analysis

After our knowledge elicitation, we developed a hierarchical task analysis based off of our user's decisions. This task analysis outlines the task that we gave to our user.
Peer user testing task analysis

Decomposition Table

While performing the teardown of the Electric Knife, my team and I documented the process and developed a table, cataloging every part and its details.
Full Document
Electric knife decomposition table preview

Function Structure

After the teardown, we analyzed the parts and determined how the parts interact, and as a whole how the product operates.
Electric knife subsystem function structure
Electric knife main function structure

CAD Assembly

My team and I were tasked to develop CAD models of each component of the product. We divided the work into 3 sections: the motor, the blade and its immediate components, and the exterior shell and its immediate components, which I was responsible for.
Exploded view of CAD assembly of Proctor Silex Electric knife
Photo of Nezy

About me

My name is Marinez Jose, but you can call me Nezy, and I'm a senior studying Mechanical Engineering at Tufts University. The core of my motivation stems from my sheer love of creating. For much of my life this fostered a passion for the fine arts, which has since shifted gears toward engineering. That love continues to drive my engineering work and, I feel, is apparent in many of my designs. Right now I'm very interested in working in robotics, medical devices, and consumer electronics.