GR6: User Testing

Design

Overview

ArtBark is designed with two user groups in mind: artists who are looking for detailed, meaningful feedback about a work, and reviewers who will provide that feedback. As many users will play both roles, the artist’s upload interface, the reviewer’s interface, and the artist’s review interface were designed to flow together seamlessly for learnability and consistency.

Accounts

We decided to use a very simple account model for ArtBark. The account design is very similar to other highly efficient web applications such as Imgur, where users do not need to register an account in order to share a photo; rather, content is shared through unique URLs. In our final user evaluations, most of the users appreciated the straightforward and no-nonsense security model.

Login page

The login page was designed to be very simple and straightforward. In our user testing and heuristic evaluations, a number of users reported being confused by the multiple login options we had provided (we had originally given users the option of viewing existing art by personally inputting their email address and the art name, as well as the option of uploading new art). As a result, we chose to present only the most basic login option at the login page. Existing artists and reviewers are only able to access existing works through personalized URLs.

As described, once the artist has finished uploading his/her art, he/she is presented with links for each group that will be viewing the art.


When a reviewer navigates to one of the aforementioned login URLs, he/she will be presented with a customized interface that minimizes confusion. As shown below, a reviewer only sees the single option of viewing the specified work of art. Logging in will take the reviewer to the reviewer interface.


If an artist navigates to the URL for his/her uploaded work, he/she will immediately be taken to the artist interface.

Upload Interface

The artist who wants feedback on their art must first submit that art to our system.  After logging in and indicating that they are going to upload some art, they are taken to the upload page.  The upload page features two ways to upload an image: drag and drop, and an upload button.  Either way, after optionally giving their art a title and watching an image upload status bar, they are whisked away from the upload page to the Review Group Privacy Setup page.

The Review Group Privacy Setup page is a page whose existence fell out of our users’ needs, wants, and demands.  Its basic purpose is to let users retain control over who gets to see others’ comments on their work.  This helps to resolve the social dilemmas that may occur when reviewer groups have different levels of social status and social power, by letting the user sidestep any potential social problems.

The page features draggable “pills” that represent the different target user groups of our application.  The place where they drop the pill decides the inter-group visibility of the comments.
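The visibility rule implied by the pill placement can be sketched as a small predicate. This is a sketch under assumed semantics; the group names, the `canSeeComments` helper, and the exact rule are illustrative, not the shipped code:

```javascript
// Hypothetical model of the privacy grid: each group pill is dropped
// into either the "public" or "private" column. Public groups can see
// each other's comments; private groups see only their own.
// (Illustrative names and rules, not the actual ArtBark code.)
function canSeeComments(viewerGroup, authorGroup, placements) {
  if (viewerGroup === authorGroup) return true; // always see your own group
  // cross-group visibility only when both groups were dropped in "public"
  return placements[viewerGroup] === "public" &&
         placements[authorGroup] === "public";
}

const placements = { mentors: "public", peers: "public", family: "private" };
```

Keeping the rule in one pure function makes it easy to test exactly the cross-group cases that motivated the page.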

Once the user continues on from the privacy settings page, they arrive at the Tags page.  This page fulfills users’ need to direct the type and sort of feedback that they receive from their peers and mentors.  Tags are simply short words or phrases, much like those seen on sites like Flickr or Twitter, that suggest a theme for feedback.
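Because testers were unsure whether tags should be comma-, space-, or quote-separated, a forgiving parser helps. The sketch below (a hypothetical `parseTags` helper, not the actual ArtBark code) accepts any of those separators, strips a leading `#`, and deduplicates:

```javascript
// Hypothetical tag parser: splits on commas, whitespace, or quotes,
// normalizes case, strips a leading '#', and removes duplicates.
// (Sketch only -- not the actual ArtBark implementation.)
function parseTags(input) {
  const seen = new Set();
  for (const raw of input.split(/[,\s"]+/)) {
    const tag = raw.replace(/^#/, "").toLowerCase();
    if (tag) seen.add(tag);
  }
  return [...seen];
}
```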

Reviewer Interface

The reviewer is expected to receive a link from the artist requesting a review, and simply enters his/her email address; an account is created on the fly if one doesn’t already exist, keeping the barrier to entry low.

The user is presented with the most recent version of the work (user testing showed that having access to multiple versions was confusing), and any public comments that have already been entered (in this case there aren’t any yet). Reviewers can provide a high-level rating, general comments, or more specific annotations to cover all levels of feedback the artists we interviewed were interested in.

Rating stars are directly manipulated: highlighting on hover, and filling-in on click.
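To display a (possibly fractional) average rating, the star widget needs a mapping from a score to per-star fill states. A minimal sketch, assuming half-star rounding (the `starStates` name and the rounding rule are our illustration, not the shipped code):

```javascript
// Hypothetical helper: map an average rating to per-star fill states,
// rounding to the nearest half star (e.g. 3.3 -> three full stars and
// one half star). Sketch only, not the actual ArtBark code.
function starStates(rating, max = 5) {
  const halves = Math.round(rating * 2); // rating in half-star units
  return Array.from({ length: max }, (_, i) => {
    if (halves >= (i + 1) * 2) return "full";
    if (halves === i * 2 + 1) return "half";
    return "empty";
  });
}
```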

Users can make general comments about the piece that will appear in the container on the right-side. The artist-defined tags help to guide the reviewer’s feedback without restricting it. We had originally required users to categorize their comments based on a predefined set, but in prototyping we discovered that reviewers found this limiting and stressful. Tags offer a more flexible, but still guided user experience.

When the comment appears in the comment panel, the user is also presented with some embedded editing tools. Icons are used to give a quick information scent: The pencil allows users to edit the comment text, and the X-icon allows users to delete comments. When the user deletes a comment (X is highlighted on hover below), a message appears at the top of the interface giving the user the option to undo the action, as per users’ requests in testing. This message does not interrupt user interaction, like a dialog would, but fades after some time, again for flexibility. The user can dismiss the notification manually. This behavior is very similar to the email deletion behavior in Gmail.
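The delete-then-undo behavior can be modeled as a pending-deletion buffer that is committed when the notification fades or is dismissed. A minimal sketch (the `UndoableDelete` class is hypothetical, not the actual implementation):

```javascript
// Hypothetical model of delete-with-undo: a deleted comment is held
// in a pending slot until the notification fades (commit) or the user
// undoes. Sketch only, not the actual ArtBark code.
class UndoableDelete {
  constructor(comments) {
    this.comments = comments;   // map: comment id -> comment text
    this.pending = null;        // last deletion, until committed
  }
  remove(id) {
    this.commit();              // only one pending deletion at a time
    this.pending = { id, text: this.comments[id] };
    delete this.comments[id];
  }
  undo() {
    if (this.pending) this.comments[this.pending.id] = this.pending.text;
    this.pending = null;
  }
  commit() {                    // called when the notification fades
    this.pending = null;
  }
}
```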

Preceding the comment is a pin - on hover, a pointer cursor reveals it to be draggable - users can drop this pin anywhere on the image to associate this comment with a particular part of the art.

In this sense, a pin is an optional feature of a comment. We had originally treated “general comments” and “annotations” as separate entities, but early prototyping revealed that users found this model confusing and inflexible, and annotations difficult to discover. This integrated, flexible approach proved much more intuitive in user studies.

Users can also create pinned comments directly by clicking anywhere on the image--a crosshair cursor indicates this option on image hover.


The user can edit pinned comments in-place by double-clicking (shown below) or in the comment display for flexibility and consistency.

  
When the user hovers over pinned comments in the comment panel, the corresponding pin will highlight, so the user need not remember the correspondences (recommended by heuristic evaluation). Likewise, the pinned comment in the panel will highlight when the user hovers over the pin.

Feedback will automatically be visible to the artist, but can be manipulated at any time. Originally, we allowed users to “Push” or “Post” comments to the artist, but they found the automated model more efficient and less confusing in testing.

Artist Interface

The artist can arrive at this interface through the direct link that is provided at the end of the upload interface. The comments and an average rating appear on the artist’s panel, which is designed to be a static version of the reviewer interface, without the capability to add comments.

Filters

We allow the artist to filter the comments on his/her art using toggle buttons, as seen on the right. While our original design used tabs to allow users to filter the comments by group, we received feedback during our paper prototypes that the tabbed interface was confusing and inconsistent with the toggle buttons that we used for the tags. Users in subsequent stages of testing felt that the filters were easy to understand and use.

Implementation

Front End

Our interface was implemented in JavaScript and HTML. We used the Bootstrap framework for layout and styling, and jQuery for ease of implementation. Drag-and-drop features (i.e. reviewer group set-up and pin manipulation) were implemented using the jQuery UI library.

In general, implementing the interface was straightforward. We did run into some limitations, however, when it came to manipulating the work of art. We originally wanted to allow reviewers to magnify sections of the art, but this proved difficult to implement. We were also unable to go back and make pin positions relative so that the interface could handle window resizes. Proper alignment of draggable objects also proved challenging.

Server

Our server is implemented in NodeJS.  Its primary purpose is to store the data of various users while their sessions are ongoing and to share that data between the people seeking feedback and the people giving feedback.  All of the user data is stored in a JSON object, except for the uploaded art file, which is stored on the regular file system.  The JSON data store is also periodically written to the file system, to ensure some degree of fault tolerance in the system. Static files are served using a NodeJS module, rather than a heavier approach like Apache.

Accounts

Our account model is very simple. It is assumed that reviewers and artists who have the direct URL to view a piece of art should have permission to view it. A direct URL to an uploaded work of art will contain the work’s title and the artist’s name (which is assumed to be a unique combination) as parameters in the URL. Once a reviewer logs in through this URL, the corresponding image and comments will be retrieved from the server to populate the reviewer’s review page. Similarly, when an artist logs in through a direct URL, the appropriate art image, comments, and filter data will be retrieved from the server to populate the artist review page.
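Under this scheme a direct link just carries the artist name and work title as query parameters. A sketch using the WHATWG URL API (the parameter names and base URL are assumptions for illustration, not the real link format):

```javascript
// Sketch of the direct-URL scheme: encode the artist's name and the
// work's title (assumed to be a unique pair) as query parameters.
// (Illustrative parameter names, not the actual ArtBark link format.)
function directUrl(base, artist, title) {
  const url = new URL(base);
  url.searchParams.set("artist", artist);
  url.searchParams.set("title", title);
  return url.toString();
}

function parseDirectUrl(link) {
  const params = new URL(link).searchParams;
  return { artist: params.get("artist"), title: params.get("title") };
}
```

`URLSearchParams` handles the percent-encoding of spaces and punctuation, so titles round-trip safely.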

While this approach limits the security of our application and makes it open to the possibility of unrelated users guessing the link to a given piece of art, we decided that our design should be simple, efficient and unrestrictive. A more secure design may use some sort of encryption for the art title and artist name, such that an individual who knows the title and the name cannot easily guess the correct URL.

Filtering

Our filtering model is implemented through the use of classes on each comment. When a comment is tagged or created by a specific group, the name of the tag or group is added as a class to the comment, allowing us to efficiently and easily change the visibility of relevant comments using the filter buttons.
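The visibility decision behind those class-based filters can be isolated as a pure predicate. The sketch below (a hypothetical `isVisible` helper, not the shipped code) uses "any-of" (OR) matching within tags, which one tester requested in place of the stricter all-tags behavior:

```javascript
// Sketch of the class-based filtering decision: each comment carries
// its group name and tag names as classes; it stays visible only if it
// matches the active group filters and the active tag filters. Empty
// filter sets match everything. (Illustrative, not the actual code.)
function isVisible(commentClasses, activeGroups, activeTags) {
  const has = name => commentClasses.includes(name);
  const groupOk = activeGroups.length === 0 || activeGroups.some(has);
  const tagOk = activeTags.length === 0 || activeTags.some(has);
  return groupOk && tagOk;
}
```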

Persistence and Editing

All actions that generate, edit, or destroy data visible to other users of the system, or to that same user later, are saved to the application server’s data store.  This means that this data is immediately available to all other users of the application.  In the future, we would plan to implement live updating on the review and reviewer pages, so that updates to things like comments, scores, and annotations would show up without needing to refresh the page. 

Evaluation

Setup

Users were found through personal contacts:

  1. art professor at the School of the Museum of Fine Arts (director of the Artists’ Resource Center).
  2. digital artist making drawings on iPad
  3. digital photographer, tech savvy
  4. RISD-educated Digital Media Artist and Painter with a programming background

They are representatives of our target populations.

We decided not to do a demo, so as not to bias the learnability of the interface beyond the briefing / tasks.

User briefing and tasks:

Artists' briefing

"You are requesting feedback about a digitized visual art work that you have been working on. You use ArtBark to collect tangible, actionable feedback to improve your art piece. This comprises two stages that are separated in time: 1) uploading the work and requesting the customized feedback from specific individuals and 2) browsing through the feedback after it has been received."

Artists' tasks

  • Set up the review process: upload the art, select the groups of commenters and the categories/tags desired for the review, and invite commenters
  • Review comments/annotations from commenters; filter by categories (tags) and by groups

Commenters' briefing

"Your feedback about a visual art work is being solicited. You want to help the requester improve the art piece you are about to see."

Commenters’ tasks:

  • Enter and edit general comments; categorize them through tagging
  • Place annotations directly on the picture, edit them, and categorize them through tagging
  • Rate the art

Usability problems found and potential solutions

User 1: Director of the Artists’ Resource Center and professor at the School of the Museum of Fine Arts in Boston.

  • (Aesthetics) Expect the presentation screen to be more visually appealing to visual artists.
  • Tags input functionality: Artists may not be familiar with tags; she would expect more explanation, e.g. "Encourage your reviewers to enter feedback along these aspects, you will be able to filter when you review feedback"
  • Sharing the links: Would like slightly more explanation, e.g. “Share links for commenters to review and make comments”
  • Annotation feature was not very discoverable to her, she expected a sentence about it above the image (for first timers).
  • Pins’ colors: perhaps different commenters could have different colors to visually differentiate them

User 2: artist who mostly draws digitally (on an iPad).

Artist setup

  • Title and upload/drag and drop very clear
  • Drag and drop for groups not clear (learnability): the instructions are long text that is not being read. She tried to click on the group icon and expected it to move. However, once learned, it is very quick and simple to move (efficiency) and very safe to undo/change. Suggestion: before any group icon is moved, show dotted arrows from one of the group icons to the first cell of both Public and Private; this makes it very clear that the icon is to be dragged to one of those locations. Once the icon is placed in a cell, the arrows can disappear.

Reviewer

  • Selected Title doesn’t show up (“Untitled”) - Artist/reviewer view
  • Undoing comments: very intuitive, love the safety at the top.
  • Annotation: Didn’t think she could annotate on artwork image directly, but loves the feature.
  • Pins color: likes that they are all in the same color for the reviewer.
  • Usage of the Tags for the reviewer is conceptually not clear at first (also the tags did not show up, so I described what should have appeared).

Artist review

  • Reviewing comments: would expect the most recent comment at the top (reverse chronological order). Also, would expect that mousing over annotation text on the right would make the annotation pin change color on picture (and perhaps show the comment on the picture) too.
  • Tags (visual mapping): As an artist, would expect some indicator of how many comments have been entered for each tag to know at first glance what are the comments focused on. Not necessarily the number itself, but maybe the color value (intensity) of the button can be an indicator for that?

User 3: Advanced digital photographer (very computer/tech-savvy), tested in Chrome on a MacBook

Setup

  • Likes drag and drop of the groups – site simple so far.
  • Box of group names should cover the tag dotted lines once on it
  • Say explicitly who are the people in each group
  • Added commas between the words in tags; this was not good, and she did it again
  • Would be good to spell check the hashtags
  • URL addresses: Would be nice to have a button on the right to copy the address to the clipboard (efficiency)

Reviewers

  • Hashtags (learnability): some artists don’t use hashtags
  • Good to use either text for hashtag or button! (efficiency)
  • It would be good to autocomplete the tags as we type
  • The email identifier in others’ comments is not good; replace it with the name
  • Annotation feature discoverability: If you mouse over as a reviewer: “you can tag an image”
  • When adding new comments, stop the comments view at the height of the picture and have a scroll bar (iFrame)
  • Would like to have each tag to be highlighted and clickable to filter by it (for both artist and reviewer)
  • When there are too many tag buttons, they are spilling over beyond the grey frame container

Artist view:

  • Star rating view: for a 3.3-star rating, show the 4th star half-filled
  • “I like the filtering it’s cool”
  • For tags: filtering should be and/or (all or either); right now it’s and, so if I click on mood and color and a comment has only a subset of these tags, it won’t show
  • Assign each group to a color, use for both button (filtering) and border of box of comments
  • Should always see the picture; scroll only the comments on the right

User 4: RISD-educated Digital Media Artist and Painter with a programming background

Set-up:

  • Logo gives a good feel for the app's purpose
  • First statement on login is "Upload New Art", so she expected to give an art url, but was asked for an email address, so this was confusing. Possible fix: Change wording to “Sign in to Upload Art”
  • She forgot to add a title to the work, but was auto-advanced by selecting an upload before she could. Possible fix: Add “Next” button instead of auto-advancing
  • Expected to see a thumbnail of the upload and click "next". Possible fix: Add “Next” button instead of auto-advancing
  • Thought the privacy model was straightforward
  • Thought the language of "Feedback Tags" was confusing since she didn't have feedback yet. Possible fix: Change wording to: "What Kind of Feedback Are You Looking For?"
  • Thought that the navigation model from step to step was inconsistent. Possible fix: Always have the same “Next” or “Back” links in the same location for each step

Reviewer Interface:

  • "View Existing" in sign-in has confusing language. Possible fix: Change wording to "Review ___'s Work"
  • Don't know if star-rating is valuable, but especially does not like that it is emphasized more than the comments. Possible fix: Have a more semantic sliding scale from “Needs Work” to “Ready for Showing”
  • Bad information scent for annotations. Possible fix: Have a "click image to annotate" label (possibly in place of the star rating)
  • Liked  the annotation interface once discovered
  • Liked deletion and editing interface
  • Wants to be able to add tags on the fly that will appear on the artist's side. Possible fix: Allow reviewers to define tags in comments with the hash symbol, propagate those to the artist interface as well as the  reviewer interface along with the original tags
  • Likes dragging pins
  • Cross-highlighting between pins and comments is not noticeable. Possible fix: Use a different color to indicate highlighting rather than just a glow / opacity change
  • Uncovered a bug if you edit annotation within annotation box and comment box at the same time. Possible fix: Close one edit-box as soon as another one is opened

Artist Interface:

  • Likes the simplicity -- very clear overall
  • Wants to toggle annotation pins if it gets cluttered. Possible fix: Add toggle button below the art display
  • Hard to see pin highlight when hovering over comment. Possible fix: Should change color rather than just opacity and the comment should highlight on hover as well
  • Text cursor on comments should go since you can't edit them. Possible fix: Change cursor setting to arrow
  • No place for artist to put information like the artist statement, dimensions, medium, and date. Possible fix: Add this as part of the set up, and allow artists to edit at any point. Maybe display as an expandable panel below the title with a dropdown toggle
  • Wants to be able to add supplementary photos. Possible fix: Allow artists to add more photos during and after setup, and display all of the images as thumbnails to the left of the image display. These images could also act as comment filters in the artist display

From users in the class (studio)

General (Both artist and commenter view):

  • Poor contrast between pins and image
  • Pins icons on comments
  • Hard to see hashtags: could have them as blue links that are the filter
  • Red border around annotation on the right looks like input field on error
  • Should disable mouse interactions with the art piece so that you can't pick it up and drag it around
  • Restyle annotations so that they aren't as big, also maybe recolor them since the red can be hard to see

Setup

  • upload button doesn't work [solved]
  • Make clear which image formats are being supported
  • tags page has multiple "submit" and "next" buttons [solved]
  • make more clear how to enter tags (i.e. comma, space, quote separated) and what they mean, as artists may not be familiar with the concept of tags
  • privacy settings page doesn't allow you to drag a group back to being unused [solved]
  • privacy settings page allows you to drag one group on top of another [solved]

Artist Interface

  • maybe have an overlay on the stars that shows the actual numerical rating (e.g. "Average rating: 3.8")
  • disable crosshair over art (since it implies that you can add an annotation)
  • fix scroll jumping when you toggle filters

Reviewer Interface

  • Empty stars are not very visible [solved]
  • Give feedback when they rate the piece (e.g. "You have rated this piece 3 stars. Undo?")
  • Add feedback/confirmation on deletion [solved]
  • allow commenters to pin comments that are only in the summary section
  • restyle tag buttons so that they actually look like buttons
  • Change mouse marker to indicate draggable
  • When you have a pending annotation open, cannot close annotation window if you change your mind, and cannot click on an existing annotation to close your pending annotation.
  • Expect double click to edit annotation - currently can't edit annotation by clicking on them

Reflection

Prototyping

The various stages of prototyping that we performed were very useful for arriving at our final design. There were a lot of features that we removed and a lot of features that we introduced based on user feedback. We encouraged our prototype testers to think aloud and give us any feedback that they could think of, which resulted in many widely-varying comments that were very useful for improving all aspects of ArtBark. Overall, we learned that multiple stages of prototyping is extremely useful for testing features.

One of the hindrances in our design process was the difficulty of prototyping our interface, particularly on paper. ArtBark is intended to be a highly-interactive application, with features such as drag-and-drop, hover effects, and filtering. We found that such features were difficult to accurately and efficiently represent in a paper prototype. During evaluation sessions for our paper prototype, we often had to explain to our testers the various hover, click and drag effects that were possible in our interface. While all of our users understood our explanations and were able to try out each effect, showing each interaction using slips of paper was slow and clumsy. Such an approach did not allow our testers to freely explore our interface as a new user would. While paper prototyping was still useful, we feel that ArtBark may have benefited from another, longer round of computer prototyping, where we would’ve been able to better represent the possible interactions with our interface.

Incorporating feedback

We feel that our approach for incorporating feedback was very effective. We received a lot of feedback at regular intervals throughout our design process, so we regularly compiled our feedback into a list and then triaged each piece of feedback. This way, we were able to optimally implement the most impactful and time-efficient changes; by the same token, we were also able to cut the more unimportant and time-consuming features. An indication of our success in incorporating feedback is the fact that testers in subsequent stages of user testing rarely brought up complaints about the same set of features.

Overall

We found the iterative design process to be very useful. We were able to continually update and improve ArtBark based on user feedback, and see that the results of our updates were well-received by our testers. It was very rewarding to look back on our first GR assignments and see how much our app had improved.
