Cohort One: First Hackathon

February 21, 2018

Last week, our students completed the first of three hackathons! They began on Wednesday afternoon, when every student pitched a project idea; students were then assigned to groups based on the most popular pitches. Once the students settled into their groups, they began sketching wireframes and thinking through how to structure their applications. All groups were required to use React, Redux, and an API of their choice, and they had 48 hours to complete their projects before presenting to one another on Friday afternoon.

Here is a breakdown of their completed projects:

Meal Plan
Justin, Jorge, and Ashley

A meal planning application that lets a user build a daily meal plan by diet type and target calories. A user can even choose certain categories of food to exclude from the results, such as gluten, meat, and dairy. We all shared a laugh when we discovered that the API they used did not exactly function as intended: a search excluding gluten returned, as its very first result, a “peanut butter and jelly pop tart,” which is clearly not gluten-free. The API struggle was real. Future extensions would be to exclude specific ingredients, or to include only the ingredients already in a user’s refrigerator or pantry.
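For anyone curious what a search like this looks like in code, here’s a minimal sketch. The post doesn’t name the API the group used, so the endpoint, parameter names, and response shape below are made up purely for illustration:

```typescript
// Hypothetical recipe-search request. The endpoint, parameter names,
// and response fields are illustrative only, not the API the group used.
interface Recipe {
  title: string;
  calories: number;
}

async function fetchMealPlan(
  diet: string,           // e.g. "vegetarian"
  targetCalories: number, // daily calorie goal
  exclude: string[]       // categories to leave out, e.g. ["gluten", "dairy"]
): Promise<Recipe[]> {
  const params = new URLSearchParams({
    diet,
    targetCalories: String(targetCalories),
    exclude: exclude.join(","),
    apiKey: process.env.RECIPE_API_KEY ?? "",
  });
  const res = await fetch(`https://api.example.com/mealplan?${params}`);
  if (!res.ok) throw new Error(`Recipe API error: ${res.status}`);
  return res.json();
}
```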

My Parks
Logan and Austin

Logan and Austin created a parks application that lets users search national parks and campgrounds using the National Parks Data API. With a friendly, well-styled interface, a user can easily find what they’re looking for in the outdoors and read details about each search result. A future feature would be letting users favorite certain parks and campgrounds so they can quickly find them again later.
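The National Park Service publishes this data through a free REST API. As a rough sketch of a search like theirs (the `q` and `api_key` parameters follow the public NPS docs, but treat the response fields used here as assumptions):

```typescript
// Rough sketch of a park search against the NPS API
// (https://developer.nps.gov). The endpoint and query parameters follow
// the public docs; the response fields used below are assumptions.
async function searchParks(query: string): Promise<void> {
  const params = new URLSearchParams({
    q: query,
    limit: "10",
    api_key: process.env.NPS_API_KEY ?? "",
  });
  const res = await fetch(`https://developer.nps.gov/api/v1/parks?${params}`);
  if (!res.ok) throw new Error(`NPS API error: ${res.status}`);
  const { data } = await res.json();
  for (const park of data) {
    console.log(`${park.fullName}: ${park.description}`);
  }
}
```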

Digital Telephone Pictionary
Megan and Wes

A digital version of Telephone Pictionary where two computers (APIs) talk back and forth. The user provides a starting word or caption that kicks off the game. One computer receives captions and responds with images, while the other receives images and responds with captions. It’s hilariously fun to see how computers interpret the information they’re given. Our favorite result so far: the image recognition tool interpreting a photo of an okapi as “a zebra standing on top of a horse.” Megan and Wes built it with an image recognition API from Microsoft and Google’s Custom Search API. One future extension would be to let the user jump in with their own caption after each image result.
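The core game loop is simple to sketch. In this minimal version the two API calls are passed in as plain functions; in the real project these would wrap Google’s Custom Search API (caption to image) and Microsoft’s image recognition service (image to caption), and that wiring is left out here:

```typescript
// Sketch of the Telephone Pictionary loop. The two converters are passed
// in as functions; their implementations (the actual Google and Microsoft
// API calls) are assumed, not shown.
type CaptionToImage = (caption: string) => Promise<string>; // returns an image URL
type ImageToCaption = (imageUrl: string) => Promise<string>;

async function playRounds(
  start: string,
  rounds: number,
  toImage: CaptionToImage,
  toCaption: ImageToCaption
): Promise<string[]> {
  const history: string[] = [start];
  let caption = start;
  for (let i = 0; i < rounds; i++) {
    const imageUrl = await toImage(caption); // one "player" draws the caption
    caption = await toCaption(imageUrl);     // the other describes the drawing
    history.push(imageUrl, caption);
  }
  return history; // alternating caption, image, caption, image, ...
}
```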

At the end of their presentations, we celebrated their accomplishments by heading over to Durty Bull Brewing and grabbing some beers. The following week, the instructors sat down with each group and gave an in-depth code review to help the students understand how they could have written their applications more efficiently.

If you’re interested in taking part in the Project Shift experience, we’re now taking applications for Cohort 2. Apply here to learn more.