Design Description:

The design is documented through the following sequence of screenshots (images not reproduced here; captions only):

- Logging in
- My Uploads
- My Uploads: Dragging Files
- My Uploads: Dropped Files
- My Uploads: Upload Progress
- My Uploads: Uploaded
- My Uploads: Deleting Album
- My Uploads: Editing Album Details
- My Uploads: Release Date
- My Uploads: Adding Genre
- My Uploads: Editing Track Name
- My Uploads: Album Approval
- My Uploads: Searching Uploads
- Reporting
- Reporting: Filtering
- Reporting: Export to Excel
Implementation Description:
Describe the internals of your implementation, but keep the discussion on a high level. Discuss important design decisions you made in the implementation. Also discuss how implementation problems may have affected the usability of your interface.
The interface was implemented on top of Django, with client-side UI code using jQuery and Bootstrap, along with a number of UI-related libraries: LESS to manage CSS styles, X-Editable for in-place editing, and Tablesorter for sorting tables.
Django enforces a model-view-controller-like design strategy, which was mirrored in the JavaScript code. Each conceptual function (listing uploads, editing uploads, and reporting plays) was divided into separate JavaScript files and HTML templates for ease of maintenance. Django models such as UploadGroups, Albums, and Tracks were mirrored in the JavaScript code with corresponding classes; these objects help maintain UI consistency by posting events about themselves to the window (allowing, for example, list entries to update dynamically when the contents of in-place editors are saved).
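The event-mirroring pattern described above can be sketched as follows. This is an illustration only (the class and event names are hypothetical, and a plain listener registry stands in for the `window.dispatchEvent` mechanism the report describes):

```javascript
// Minimal sketch of model objects broadcasting change events so that
// list entries and in-place editors stay consistent. In the browser
// this used events posted to the window; here a plain listener
// registry stands in so the idea is self-contained.
class Album {
  constructor(id, title) {
    this.id = id;
    this.title = title;
    this.listeners = [];
  }
  on(handler) {
    this.listeners.push(handler);
  }
  // Called when an in-place editor saves; every subscribed view updates.
  setTitle(title) {
    this.title = title;
    this.listeners.forEach(fn => fn({ type: 'album:changed', album: this }));
  }
}

const album = new Album(1, 'Killer Kilometr');
let rendered = '';
album.on(e => { rendered = e.album.title; });  // a list entry re-renders
album.setTitle('Killer Kilometer');            // an editor saves
```

The key design choice is that views never poll the model: each save broadcasts once, and every interested list entry updates itself.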
Because the code was initially written against the existing WMBR music database schema, a number of limitations arose from requirements imposed by Django (such as the requirement of exactly one primary-key column, and the inability to use Django's custom column types, like FileField). Some of these issues were resolved by simply migrating to an appropriate object design and backing table; others remain unresolved (e.g. genres are stored as comma-delimited strings rather than as separate models with a join table, so the JavaScript code currently splits the string to render separate tag buttons).
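The comma-splitting workaround mentioned above amounts to something like the following sketch (the function name is an assumption, not taken from the actual codebase):

```javascript
// Hedged sketch: genres live in one comma-delimited string on the
// model, so the client splits it into individual tag labels.
function splitGenres(genreField) {
  return genreField
    .split(',')
    .map(g => g.trim())
    .filter(g => g.length > 0);   // drop empty entries from stray commas
}

splitGenres('Electronica, Indie Pop,');  // → ['Electronica', 'Indie Pop']
```

A proper join table would make this unnecessary and would also allow genre-level queries on the server.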
Perhaps the most complex code was that which enabled drag-and-drop uploads in the first place. Although the HTML5 drag-and-drop and file APIs did not actually restrict what was possible, practically all of the drag-and-drop behavior (including selection, dragging, and dropping of MP3 files into separate upload groups) had to be written from scratch on top of the underlying drag events. Furthermore, because Firefox and Chrome order drag-and-drop events differently, the "dragleave" and "dragover" events required careful management to correctly fade the "Upload" overlay.
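One common way to make overlay fading robust against these event-ordering differences is a depth counter, since dragenter/dragleave also fire for every child element crossed during a drag. The sketch below shows the counter technique in isolation (DOM wiring omitted); it is an illustration of the general approach, not the project's actual code:

```javascript
// Depth-counter technique for a drag-and-drop overlay: toggling
// visibility on every dragenter/dragleave flickers (child elements
// fire their own events, in browser-dependent order), so instead we
// count nesting depth and hide only when it returns to zero.
class DropOverlay {
  constructor() {
    this.depth = 0;
    this.visible = false;
  }
  dragenter() {
    this.depth += 1;
    this.visible = true;          // fade in on the first enter
  }
  dragleave() {
    this.depth -= 1;
    if (this.depth <= 0) {        // only fade out when the drag has
      this.depth = 0;             // really left the drop target
      this.visible = false;
    }
  }
}
```

In the browser, `dragenter`/`dragleave` handlers on the drop target would simply delegate to these two methods.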
With respect to the uploads themselves, limitations in HTML upload functionality prevented the "pause" button suggested by our earlier feedback from behaving properly. At the moment it is a placebo that does not actually affect the upload, as the only alternative is to cancel the upload entirely and restart it, which seems equally undesirable. Given the original use case that justified the "pause" functionality (pause would be clicked before "Delete Upload", which cancels the upload in any case), a placebo seems to carry no significant downsides, while offering real upsides in basic learnability and initial comfort with the design.
Other conceptual design issues stem from the split between the JavaScript and Django code. For example, filetypes are checked not only in the JavaScript code that supports drag and drop, but again on the server once the upload is complete, in order to parse music metadata. This is inefficient, and it means that additional filetypes must be added in more than one place.
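The client-side half of that duplicated check might look like the following sketch (the constant and function names are assumptions; the server-side metadata parser keeps its own copy of the same whitelist, which is exactly the maintenance hazard described above):

```javascript
// Illustrative client-side filetype filter for the drag-and-drop code.
// The same extension list must be mirrored in the Django-side metadata
// parser, so adding a filetype means editing (at least) two places.
const AUDIO_EXTENSIONS = ['mp3'];

function isSupportedFile(filename) {
  const ext = filename.split('.').pop().toLowerCase();
  return AUDIO_EXTENSIONS.includes(ext);
}
```

Serving the whitelist from a single server endpoint (or a shared config file) would remove the duplication at the cost of an extra request or build step.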
CSS limitations in the implementation of X-Editable prevented using the "pop-up" design of album-level editable details for track-level details. As a result, consistency was necessarily broken between album-level details (presented in informational pop-ups) and track-level details (edited in line). Users generally had little trouble moving between the two, though several expressed general concerns about when data was saved, regardless of the editor. The learnability benefits of the pop-up balloon, however, far outweigh the cost of the inconsistency of not using pop-up balloons at the track level.
Similarly, limitations in X-Editable prevent pairing a date selector with a text editor for dates; without one, release dates may take longer to edit because of the need to search a calendar display. A ComboDate editor was evaluated, but given the ever-changing bounds of release dates (the system may need to support dates many years in the future as well as many years in the past, making the year combo box difficult to scan), it did not seem a sufficient replacement. As such, we stuck with the calendar editor, although it is unlikely to be ideal.
In the reporting mode, the most significant implementation issues arose from filtering by genre and release date. Filtering by genre and reordering the table were implemented client-side, which provided very responsive behavior that did not depend on the server. Filtering by release date, however, required fetching partially-rendered HTML for the table from the server, since the number of releases over all time is likely to prove unacceptably large for client-side filtering. The trade-off of running the other filters client-side is that it is unclear how well the table behaviors will work at scale: server-side filtering is easily benchmarked, but sorting and filtering performance on tables of more than a few dozen rows is untested.
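The client-side genre filter amounts to a simple predicate over the already-rendered rows, as in this sketch (field names are hypothetical; in the real interface the rows are table elements shown or hidden in place rather than a returned array):

```javascript
// Sketch of client-side genre filtering for the reporting table:
// rows already delivered to the browser are matched against the
// selected genre, with no round trip to the server.
function filterByGenre(rows, genre) {
  return rows.filter(row => genre === null || row.genre === genre);
}

const rows = [
  { album: 'Killer Kilometer', genre: 'Electronica', plays: 12 },
  { album: 'Other Album',      genre: 'Jazz',        plays: 4 },
];
filterByGenre(rows, 'Electronica');  // leaves only the Electronica row
```

Because the filter runs over whatever rows are present, its responsiveness is bounded by table size, which is precisely why release-date filtering (over all releases ever) had to stay on the server.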
Finally, space limitations from the standard HTML table layout algorithm cramped the "Play Count" columns, which may impede readability and obscure the durations over which play counts are aggregated.
Evaluation
User Population
All users are Music Directors at WMBR and fall squarely within our target user population. We reached these users through a group member who is a DJ at the station but not a Music Director (satisfying the project requirement that no group member belong to the target user group).
The four users range in age from their 20s to their 60s, with varying levels of technical proficiency; three are female and one is male. All had a need for a music-importing system such as KaJaM!. Some use the uploading feature more, while others are more concerned with the reporting function. Each Music Director is typically in charge of a single music genre, and some genres deal with more digital content than others, so usage of the KaJaM! application is expected to differ across users.
Conducting User Tests
User testing was conducted at the WMBR radio station on the MIT campus, with sessions lasting 30 minutes on average. We briefed the users by explaining the scope of our application (limited to importing music, not playing or managing the library) and our hypothetical scenario. The briefing was kept consistent across users by showing each the same instruction sheet. We also explained that we were interested in how users naturally interact with the interface, and assured them that there was no right or wrong way to use the application.
All users interacted with a MacBook laptop owned by one of our group members, using its trackpad and keyboard to navigate the KaJaM! interface. One group member acted as facilitator, walking the user through the scenario tasks and providing assistance only when necessary. The remaining two members sat alongside the user and noted critical incidents and usability issues.
We decided that a demo was not necessary given the intuitive nature of the interface. The scenario tasks presented to all users were as follows:
| Scenario | Task |
|---|---|
| Imagine you are a music director, Lana, and you wish to upload some albums to the digital library via the KaJaM! interface | Open KaJaM! and log into the system |
| You have downloaded 2 albums onto your computer: Starmarker.zip and KaJaM.zip and wish to import them to the library | Import Starmarker.zip and KaJaM.zip into the digital library |
| Before approving the KaJaM! album, you want to listen to one of the tracks | Play a track in the KaJaM! album |
| The track from the KaJaM! album sounds familiar, and you quickly realize you have already imported the album and don't need it again | Delete the entire KaJaM! album |
| The Starmarker.zip file finishes importing, and you wish to make sure all the track details are correct | Inspect track details |
| You find incorrect and incomplete track details you want to fix | Edit track details to reflect the correct information* |
| You wish to approve the album to officially file it in the digital library | Approve the album |
| Before you leave, you wish to complete College Media Journal reporting for the month | Take a look at the reporting data in the digital library |
| You notice the reporting data is displaying all genres, but you want to look at only "Electronica" | Filter the reporting data to show only "Electronica" |
| Now all reporting data is on "Electronica" only, but it is not sorted by play count in the last month | Sort the reporting data by play count in the last month, in descending order |
| Everything looks good, but you still want to make adjustments to the reports based on physical CD play count before submitting them | Export the reporting data to Excel file format for further editing |
| All done with the tasks! | Done! |
*Assume Lana Googles the correct album information, so the facilitator provides all the correct information to the user on a separate sheet of paper:
| Track # | Track Name | Artist | Album | Label | Release Date | Genre |
|---|---|---|---|---|---|---|
| 1 | While I'm Dead | Starmarker | Killer Kilometer | Polyvinyl | 2013-02-20 | Electronica |
| 2 | Sand Beauty | Starmarker | Killer Kilometer | Polyvinyl | 2013-02-20 | Electronica |
| 3 | Silver Lining | Starmarker | Killer Kilometer | Polyvinyl | 2013-02-20 | Electronica |
Usability Problems Observed
Describe the Problem Here. (Severity Level Here)
Details Here.
Potential solutions:
Details Here.
Describe the Problem Here. (Severity Level Here)
Details Here.
Potential solutions:
Details Here.
Describe the Problem Here. (Severity Level Here)
Details Here.
Potential solutions:
Details Here.
What Users Liked
Details Here.
Reflection
Discuss what you learned over the course of the iterative design process. If you did it again, what would you do differently? Focus in this part not on the specific design decisions of your project (which you already discussed in the Design section), but instead on the meta-level decisions about your design process: your risk assessments, your decisions about what features to prototype and which prototype techniques to use, and how you evaluated the results of your observations.