With the first quarter coming to a close, we have set up a powerful infrastructure for filling out the features of Quick & Fresh. Our latest iteration allows us to query for a restaurant by name and location, and retrieve a list of menu items that meet a calorie criterion. We also have API endpoints available for individual pieces of this request, such as location only or menu items only. This will allow a lot of flexibility in structuring our front end.
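To make the combined query concrete, here is a rough sketch of what a request and response might look like. The URL, parameter names, and field names are illustrative assumptions, not our finalized API:

```javascript
// Hypothetical combined query:
//
//   GET /restaurants?name=chipotle&lat=47.6&lng=-122.3&maxCalories=600
//
// might return a document shaped roughly like this (field names assumed):
var sampleResponse = {
  restaurant: { name: "Chipotle", location: { lat: 47.6, lng: -122.3 } },
  menuItems: [
    { name: "Chicken Salad", calories: 540 },
    { name: "Veggie Bowl",   calories: 420 }
  ]
};

// Every item returned should satisfy the calorie criterion:
var allUnderLimit = sampleResponse.menuItems.every(function (item) {
  return item.calories <= 600;
});
```

The location-only and menu-items-only endpoints would return just the corresponding sub-document.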
Our front end is also capable of displaying this data to the user with Google Maps integration, and we will be able to rapidly style the pages with Twitter Bootstrap. Angular.js, our front-end MVC framework, allows us to reuse templates and modularize our code effectively.
Looking towards Winter break and Q2, we will have several questions and challenges we need to address:
- How do we plan on caching and transforming data in the backend? Do we keep a NoSQL solution, or add a SQL database?
- How can we group data? Can we categorize menu items, or do we just show them in alphabetical or caloric order?
- What are our UX goals? What interface is going to be most effective in meeting user needs?
- Each team member should focus on a certain piece of the app so we can direct our research efforts.
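The grouping question above can be prototyped quickly in plain JavaScript. This is just a sketch of the two options (field names like `category` are assumptions, not our final schema):

```javascript
// A flat list of menu items, as it might come back from our services layer:
var items = [
  { name: "Fries",  category: "Sides",   calories: 365 },
  { name: "Burger", category: "Entrees", calories: 550 },
  { name: "Salad",  category: "Entrees", calories: 320 }
];

// Option 1: caloric order (slice() copies so we don't mutate the original).
var byCalories = items.slice().sort(function (a, b) {
  return a.calories - b.calories;
});

// Option 2: bucketed by a category tag.
var byCategory = items.reduce(function (groups, item) {
  (groups[item.category] = groups[item.category] || []).push(item);
  return groups;
}, {});
```

Whichever way the UX question shakes out, both views can be derived from the same flat list on the client.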
This week we had successes all around in creating various parts of the app. We have calorie and location data on the backend from two third-party APIs, and can serve it up to our web app to display on a map. Now, we need to respond to user search criteria and display the results based on the data. So far we have a query that can accomplish this, but now we need a data structure that combines all of our data sources and persists them as well.
Storing data locally is going to be a challenge: restaurants have locations and menus, menus have menu items. Clearly this is a fit for relational storage, but we are currently using a NoSQL solution. Using Mongo as our backend might not be the best choice for storing the information locally, and may require another data management solution alongside it. For now we will continue with our infrastructure, but we are going to make sure to abstract away the data access layer so we can cut over to another solution later in the design of the app.
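One way to picture that data access abstraction: callers depend only on a small store interface, so the Mongo-backed implementation can be swapped for a SQL-backed one later without touching them. All names here are hypothetical, and the `db` handle stands in for a connected Mongo database:

```javascript
// Hypothetical Mongo-backed store. One embedded document per restaurant,
// with location and menu items nested inside it.
function MongoRestaurantStore(db) {
  this.db = db; // e.g. a connected mongo database handle
}

MongoRestaurantStore.prototype.findByName = function (name, callback) {
  // The node Mongo driver's findOne(query, callback) returns (err, doc).
  this.db.collection("restaurants").findOne({ name: name }, callback);
};

// A SQL-backed store would expose the same findByName(name, callback)
// signature, joining restaurants, menus, and menu_items behind the scenes.
```

As long as the rest of the app only sees `findByName` and friends, the cut-over decision stays cheap.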
Require.js was initially useful because it modularized our code and kept us from polluting the global namespace. However, time and time again we run into problems where a third-party library assumes some framework is globally available, but that framework is now hidden away in its own module. This is currently causing us more headaches than a global namespace would. There is a painful decision ahead regarding the best way to move forward:
- Ditch require.js: This option will lend itself towards plug-n-play development and prototyping, but may cause trouble later on. It is also painful because we need to de-require.js our current app, as it has been set up assuming require is being used.
- Require.js deep dive: We dig deep into Require.js and its examples so we know what it is expecting with third-party integration. This choice feels right from an architectural standpoint, but will take time to learn the framework. Time is not a commodity we have an abundance of.
- Find a specific example of Google Maps + Require.js integration: We fix this particular problem as fast as possible by looking for examples and solutions, then move on. This might get us hooked into the Google Maps integration we need, but it could also induce more problems as we dive deeper into this specific use case without broader knowledge of Require.js.
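For what it's worth, the deep-dive option likely leads to RequireJS's `shim` config, which is the documented way to wrap libraries that expect globals so they can be required as modules. The paths and names below are placeholders, not our actual files:

```javascript
// RequireJS shim config: adapts globals-style libraries to module loading.
requirejs.config({
  paths: {
    jquery: "libs/jquery.min"
  },
  shim: {
    // A library that attaches itself to window and depends on jQuery:
    "libs/some-plugin": {
      deps: ["jquery"],      // load jQuery first
      exports: "SomePlugin"  // the global the library creates
    }
  }
});

// Note: Google Maps is a special case because its script loads other scripts
// asynchronously via a callback, so a shim alone may not be enough; a loader
// plugin such as the community "async" plugin is the commonly cited approach.
```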
This week we completed an end-to-end prototype of the site and its RESTful architecture. This includes:
- Adding data to our database via a POST request to our services layer.
- Requesting a list of users from our services layer via a GET request.
- Displaying that list as formatted HTML via Angular.js.
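The GET-and-display steps above look roughly like this as an Angular.js (1.x) controller. The endpoint and scope names are assumptions from our prototype, not final:

```javascript
// Angular injects $scope and $http; $scope.users is bound to the template,
// so assigning to it re-renders the list automatically.
function UserListCtrl($scope, $http) {
  $scope.users = [];
  $http.get("/users").success(function (data) {
    $scope.users = data; // JSON from the services layer, bound to the view
  });
}

// In the template, Angular repeats over the bound list:
//   <ul><li ng-repeat="user in users">{{user.name}}</li></ul>
```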
My main focus this week was integrating Angular.js into our app as a front end client. I used their example project and modified it to fit our needs. While initially daunting to pick up its conventions, I feel like I am beginning to see Angular's potential and its ability to automatically bind our JSON data model from the server to the HTML templates on the site. Angular is also pretty smart about handling routing, sending HTTP requests to our server, and handling the success or failure responses in the background.
While we need more time to research the best practices of the framework, I believe this proof of concept hints at the capabilities of this front end platform choice.
This week I worked on starting up various pieces of the app and connecting them. The first challenge was to set up the Play Framework as a RESTful server. This is the middle tier of the app, so it's the central point where the user clients and the data layer meet. After a few false starts I found the correct method to instantiate a project and integrate it with an IDE:
- Use the play shell to generate a new project.
- Use the play shell to create an IntelliJ (.idea) or Eclipse project integration WITH the sources.
- Start the Play server from the play shell.
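The steps above, roughly as run from a terminal. This is Play 2.x syntax as I remember it; exact subcommands vary by version, and the project name is just an example:

```shell
play new quickandfresh   # generate a new project from a template
cd quickandfresh
play idea                # or: play eclipse  (generate IDE project files)
play run                 # start the dev server on localhost:9000
```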
The next step in making Play behave as a RESTful server was to add controllers and routes which return JSON from HTTP requests. This was straightforward, as JSON is a supported return type by default: no need to specify the MIME type or set response headers manually. We had completed the DB->Play server integration early this week, so this leaves the front end.
Angular.js is the current candidate for front end communication, and as a separate server it might even be trivial to get it set up. However, Play supports the deployment and minification of JS assets, which will be helpful in the future but requires a bit of exploration to determine Play's pattern of JS integration. Hopefully this is something I can figure out by our meeting tomorrow.
Last week we settled on the platform for our application. We are using the Play Framework for our app's backend services and using Angular.js as a client side MVC framework. The data will be stored in a NoSQL database until/unless there is an adequate case to switch to a traditional SQL platform.
We set out to install the major components of our platform locally, so that each member of our team can get used to setting up a local development environment. Local environments are important because we do not want to tie our productivity to our ability to connect to a remote resource. The tradeoff is a time investment for each team member, and the local environment deployment will not mirror the actual physical deployment of our app when the time comes to create development and production servers.
Getting the required open source libraries installed was straightforward on OS X: download them and reference them in the PATH. What immediately became frustrating was a bug found when using the Play Framework Activator, a download and template manager for instantly setting up common Play Framework projects and dependencies. It makes me a little nervous that the 1.0 release of their app manager prevents OS X users from running the application. Hopefully this will be fixed soon so I can use a best-practices template for creating an Angular.js + Play project.
Looking for samples of Play + Angular interaction, I stumbled across sse-chat, a sample app that uses our chosen frameworks. I will create a project based on how it is structured to get a sampling of how a Play backend would interact with a JS front end. Overall I am excited about the platform we have chosen, and hope that my teammates will have more success in setting up a turn-key solution.
Next week we will meet to solidify our dev environments and address any issues we had. We will also begin planning to implement the high level functionality of QuickandFresh and flesh out more defined development roles for team members.