
Final Writeup

Final link: http://mv.ezproxy.com.ezproxyberklee.flo.org/collegeroute/

Design

Over the course of the semester, our design has changed and adapted through many iterations of paper prototyping and user testing.

Our design consists of 4 major tasks:

1. Searching for schools

2. Selecting schools

3. Choosing events at these schools

4. Reviewing the itinerary (we added this later)

When we first started out, we were aiming for a strict three-step process - each step was on its own separate page.

As we received feedback from our peers, user tests, and instructors, we drastically changed the flow of our website to embody more of a two-step process. On the first page, users search for schools. The second page is a combination of selecting schools, planning an itinerary, and reviewing the itinerary. We made this decision because we realized that those three tasks are very much intertwined with each other, and it would be very inefficient for the user to have to switch back and forth between two pages.

Searching for schools:

We want to make it as easy as possible for the user to get started. Since the home page is where all new users first land, we put the "Find Schools" button at the very top, in plain view, so the user can click it immediately and be taken to the results. However, we also give the user control over what type of schools to search for. Users usually have a general idea of what they are looking for - for example, schools that best match their test scores and GPA.

Selecting schools:

We knew that planning a college visiting trip usually revolves around location: most logistics are determined by the colleges' locations relative to each other. For this reason, we use a map to let users visualize the schools they searched for, since we decided it was most intuitive to pick a school to visit based on its location. The map starts at the smallest zoom level so that all school results can be displayed, with schools in the same area clustered together. As the user zooms in, the granularity increases and the clusters break apart into individual pins for each school.

This page allows users to select the schools they plan to visit in one of two ways: they can select a school from the panel on the left, or click the pin representing the school to open a detailed pop-out from which the school can be selected. We give the user both options because the affordance of clicking on a pin was sometimes lost on those unfamiliar with the Google Maps API. Since selecting schools is such an important task, we provide plenty of feedback when a school is selected: we add a green border, move the school to the top of the panel, and change the color of its pin.
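The zoom-dependent clustering behavior described above can be sketched as a simple grid-based grouping. This is a hypothetical illustration only - the actual site relies on the Google Maps API's clustering - but it shows why clusters dissolve into individual pins as the zoom level rises:

```python
from collections import defaultdict

def cluster_schools(schools, zoom):
    """Group schools into grid cells whose size shrinks as zoom increases.

    `schools` is a list of (name, lat, lng) tuples; `zoom` is an integer
    zoom level (0 = world view). At high zoom, each school tends to land
    in its own cell, so individual pins appear instead of clusters.
    """
    cell_size = 360.0 / (2 ** zoom)  # degrees of lat/lng per grid cell
    clusters = defaultdict(list)
    for name, lat, lng in schools:
        key = (int(lat // cell_size), int(lng // cell_size))
        clusters[key].append(name)
    return list(clusters.values())

schools = [
    ("MIT", 42.3601, -71.0942),
    ("Harvard", 42.3770, -71.1167),
    ("Stanford", 37.4275, -122.1697),
]

# Zoomed far out: the two Boston-area schools share one cluster.
print(cluster_schools(schools, zoom=2))
# Zoomed in: every school gets its own pin.
print(cluster_schools(schools, zoom=12))
```

The same idea underlies real marker clusterers: the clustering radius is tied to the zoom level, so no extra bookkeeping is needed when the user zooms.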

Planning the itinerary:

The itinerary planning portion of the website underwent the most changes. Event and trip planning is a tough task in any context, and representing different types of events, at different schools, at different times, that are or aren't selected proved to be our most challenging problem. We went through multiple iterations of lists and panels and ultimately ended up with a calendar-based interface. We decided the intuition behind a calendar was most consistent with the user's mental model, since users are trying to schedule which events to attend. To prevent information overload, we split the events up by school so the user isn't overwhelmed by a display of many multi-colored events. One aspect we toyed with was how to represent selected events across different schools: since we use tabs to filter events by school, we wanted to make sure selected events weren't lost in the mix. Adding tabs to the interface was a risky decision because the affordance of tabs can be vague. We address this by color coding the schools and changing the entire background color to the school's color, so that the feedback from switching schools is very obvious.

Based on our user testing, we also changed how users select the dates associated with their trip. Originally, users had to select the dates and then click "Refresh Events" for their events to appear on the calendar. In response to user feedback, we changed this so that events refresh automatically when a user selects a new start or end date for their trip.

We also introduced safety mechanisms to ensure that when deselecting schools or changing trip dates, users are made aware of whether they would lose any selected events as a result of their changes. If the changes do not affect the selected events, the changes go through immediately; otherwise, the user must confirm that he or she wants to make those changes.
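The safety check above boils down to computing which selected events would fall outside the new trip dates. A hypothetical simplification of that check (the function and field names are illustrative, not taken from the actual codebase):

```python
from datetime import date

def events_lost_by_date_change(selected_events, new_start, new_end):
    """Return the selected events that would fall outside the new trip dates.

    `selected_events` maps event names to dates. If the returned list is
    empty, the change can be applied immediately; otherwise the UI asks
    the user to confirm before dropping these events.
    """
    return [name for name, day in selected_events.items()
            if not (new_start <= day <= new_end)]

selected = {
    "MIT campus tour": date(2014, 4, 10),
    "Harvard info session": date(2014, 4, 12),
}

# Shrinking the trip to April 9-11 would drop the Harvard session,
# so the user is asked to confirm before the change is applied.
lost = events_lost_by_date_change(selected, date(2014, 4, 9), date(2014, 4, 11))
if lost:
    print("Confirm before dropping:", lost)
```

Deselecting a school works the same way, with the filter keyed on the school instead of the date range.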

Reviewing the itinerary

When the entire process is done, the user may want to summarize their plan. While a calendar is useful for actual event scheduling, it is usually more efficient to refer to a list as an agenda. We address this by providing an itinerary summary and a print feature.

Implementation

Our system, built on a Django backend, allows users to save the trips they plan as well as the personal information used to match them to particular colleges. The system is scalable in the sense that additional attributes (top majors, etc.) could easily be added as new search criteria in the future, but some aspects of our implementation have led to latency issues when loading different pages.
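A framework-agnostic sketch of the data model and matching idea described above (the field names and scoring formula are hypothetical illustrations; the real system defines these as Django models):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Profile:
    """Personal information used to match the user to colleges.

    New search attributes (e.g. top majors) can be added as fields
    here without changing the matching code's interface - this is
    the sense in which the design is extensible.
    """
    gpa: float
    sat_total: int

@dataclass
class Trip:
    """A saved trip: the selected schools plus the chosen events."""
    name: str
    schools: List[str] = field(default_factory=list)
    events: List[str] = field(default_factory=list)

def match_score(profile: Profile, school_avg_gpa: float, school_avg_sat: int) -> float:
    """Toy ranking: closer academic stats mean a better match (higher score)."""
    return -(abs(profile.gpa - school_avg_gpa)
             + abs(profile.sat_total - school_avg_sat) / 400.0)

me = Profile(gpa=3.8, sat_total=2100)
# Search results are ordered by descending match score.
schools = {"A": (3.9, 2200), "B": (3.0, 1700)}
ranked = sorted(schools, key=lambda s: match_score(me, *schools[s]), reverse=True)
print(ranked)
```

In the real application these records live in the database via the Django ORM, so saved trips and profiles persist across sessions.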

...

We tested our interface on three users. Our two original user classes were students and parents who were applying to colleges and planning to visit those schools. The three users tested were broadly representative of those user classes. The users in the evaluation were personal contacts of our team members (either family or friends).

  • MIT undergraduate who has several siblings currently looking at colleges
  • Mother of three children with one child currently looking at and visiting colleges and two children who have already completed the college search process
  • Mother of two children who have already completed the college search process (was already familiar with tasks and was not administered standard briefing)
Briefing

Purpose:

  • Parents often plan trips for their high school age children to visit colleges, but there are a lot of challenges along the way that we are trying to alleviate.
  • Find good fit colleges in a certain geographic area
  • Balance multiple campus tours/info session times in one easy place

Tasks:

  • A few tasks as if you were a college junior planning visits to schools

Disclaimers:

  • We’re not testing you; we’re testing the interface. Feel free to ask questions, but also try to explore and see what happens.

Conclusion:

  • Do you have any questions? If I can’t answer them now because it will interfere with the test I will definitely answer all unanswered questions at the end.
Tasks

Our users were given the same tasks that had been given to those testing our paper prototype. Namely:

...

  1. Calendar view does not default to selected trip dates - Major (Fixed)
    Both Users 2 and 3 could not find events on the calendar originally because the calendar view did not default to the selected trip dates. Therefore, users had to navigate through an unfamiliar calendar interface to the dates they desired. This issue was not fully apparent to the developers because the start date of the trip defaults to today. This issue was easily fixed by changing a setting in the calendar initialization.
  2. Users must hit refresh button to apply new date range - Major (Fixed)
    After selecting the dates, Users 2 and 3 struggled to apply their changes; they expected that the new dates would take effect immediately. They did not realize they had to hit "Refresh Events" in order to apply their changes. This issue has been resolved; now, once the start or end dates are changed, the calendar refreshes to reflect the new events.
  3. Users did not understand the map affordances - Major
    Users 2 and 3 did not select schools from the map or click on the map in their initial run through the tasks; older users may be less familiar with viewing and clicking on map clusters. User 1 discovered the affordances associated with the clusters and the map pins by accident. Help text would be a convenient way to allow users to discover these affordances.
  4. Users did not immediately understand the relationship between displayed search results and map display - Minor
    The displayed search results only include those search results that are within the viewport of the map. Text which explicitly labeled the list of results could help explain this (e.g., "Results in map view") 
  5. Users did not understand the ordering of the search results - Minor
    User 3 did not realize that the schools were ranked according to how well they matched the user's search criteria. Using more explicit text for the results list would also clarify this issue; a header like "Top Results in this Area" could be an effective fix.
  6. Users did not initially understand calendar affordances - Minor
    Despite help text, some users attempted to double-click on calendar events, which has the effect of selecting and then deselecting the event. More explicit instructions, like "Click once on an event to add it to the trip", could resolve this issue.
  7. Users could not figure out how to go back and start a new trip - Minor
    The logo in the top-left hand corner could be bigger or become highlighted upon hovering to better indicate a link functionality.
  8. Users creating profiles have to delete the pre-filled values (fragile text does not work) - Minor
    We could change the event handling of clicks inside the input box.
  9. After entering scores for a given test (SAT or ACT), section labels (Math, Reading, Writing, etc.) disappear in the user profile page - Minor
    We could add a set of labels for each of these boxes, or add tooltip text.
General Feedback

All in all, the feedback for CollegeRoute was positive. Users were generally excited about the idea and thought that the overall experience was well designed and implemented. Very few comments were negative in a way that required immediate or drastic change; instead, the constructive criticism pointed to features that could be implemented on top of the current design.

...

Multiple users commented on wanting the ability to directly view trip logistics and book travel tickets from CollegeRoute. While this remained on the horizon for most of the semester, interfacing with actual transportation providers is very possible and would be a welcome addition to CollegeRoute.

Reflection

Our iterative design process benefited from a solid understanding of the problem; this familiarity with the problem was developed during the need-finding stage of the project. We conducted a lot of interviews to ensure that we had a comprehensive idea of what users would want and need from an application like ours. We quickly narrowed down the user population to focus primarily on parents (or high school students) planning trips to visit colleges.

The paper prototyping process was extremely useful; we solidified or reconsidered many of our design decisions as a result of this low-fidelity process. If we were to go through the process again, we would have liked to present alternate ideas to users during this stage so that we could have tested more designs at once and gotten better feedback about what would be intuitive and useful. Going into the paper prototyping, we did have a few alternate designs in mind in case the decisions we made were shown to be difficult or poor, but it would have been nice to have initial user feedback on these alternates as well, so that deciding what to switch to when changing our first design would have been simpler.

One major example of where we should have used more paper prototyping is the itinerary planning component of our application. The computer prototyping and implementation would have been much easier had we paper prototyped additional designs for this screen; our final design ended up drastically different from the one we tested in the paper prototype. Throughout the semester, we grappled with how to design the itinerary planning portion of the site to provide an efficient, intuitive, and safe interface. Our design evolved significantly over the course of the semester, with each iteration drawing on what worked and what was challenging in the previous one. All in all, we implemented the itinerary planning page about four times, and each implementation required a significant investment of time and energy because each was fairly high fidelity and each time we believed it would be the last. We are extremely satisfied with our final design for this screen, but if we had arrived at a design closer to the final one earlier, we could have spent that time making further improvements and implementing additional features instead of re-prototyping this page.

While our lack of foresight during the prototyping process proved costly, we truly engaged in the iterative design process during the implementation phase. We were willing to throw away entire implementations when we realized that users were having trouble, based on feedback received in studio and through the heuristic evaluations. It would have been ideal to do more of these iterations with low-fidelity prototypes as opposed to high-fidelity computer prototypes. However, we always prioritized the needs and expectations of the user over any attachment we might have had to a particular design concept.