
Monday, January 8, 2018

UI/UX Case Study: Mobile Self-Checkout App Design Concept


Fashion Retail and E-Commerce App Redesign

This UI/UX case study documents the processes involved in redesigning a fashion retail and e-commerce app. The app includes a product scan feature that lets customers perform a self-checkout at a physical store. The design sprint took 11 days to complete and was submitted for the UXDI course at General Assembly, Singapore.

Project Brief

Work in a team to identify problems and/or opportunities with an existing mobile application and utilise your knowledge to design a solution.
For this project, my team selected the Uniqlo Singapore app to redesign. The brand name will not be mentioned again in this documentation; the ideas below apply to most fashion retailers with an e-commerce presence.

Overview

The 11-day group project (three members) included the following processes and methodologies:
#1 Discover
  • Background research
  • Contextual inquiry
  • User Interviews
  • Online surveys
  • Competitive analysis
  • Heuristic evaluation
#2 Define
  • Affinity mapping
  • User personas
  • Customer journey mapping
  • Feature prioritisation
  • Design studio
#3 Design
  • Wireframe
  • InVision prototype
  • Visual mock-up
#4 Testing
  • Usability testing
  • System Usability Scale study
#5 Deliver
  • Interactive prototype
  • Visual mock-up
  • Research report
  • Presentation

The Context — Competitive Retail Scene & Mobile Payment in Singapore

News clippings from The Straits Times and Channel NewsAsia
It is increasingly difficult for retail businesses to remain competitive in Singapore, due to the high rental fees of maintaining a physical store and the difficulty of hiring lower-skilled sales assistants.
In addition, consumers are increasingly shopping online on platforms such as Taobao, ASOS, ZALORA for their fashion fix.
In the Singapore National Day Rally speech (August 2017), the prime minister pushed for consumers and retailers to adopt mobile payments, through initiatives such as ‘PayNow’ and a common national QR code.
Against this backdrop, my team picked a retailer with both a physical and an online (mobile app) presence for our project.

Heuristic Evaluation

Screen grab from heuristic evaluation report–Consistency and Standards
We started by analysing the existing app to identify key problems and issues, and compared our findings against online reviews on Google Play and the Apple App Store.
App reviews on Google Play and Apple App Store
The main issue discovered was that the app is a hybrid app, i.e. it pulls its content from a web page. This creates several potential issues:
  1. Slow loading as most of the information is downloaded only when needed.
  2. The experience is not tailored to mobile. Fonts, buttons, and images appear too small on a phone screen.
  3. Navigation is inconsistent throughout the app.

Competitive Analysis

Competitive Analysis–Comparing features on the Home screen
Next, we compared the app with competitors’ apps. Competitors were identified using these four criteria:
  1. Fashion retailers with a physical store in Singapore;
  2. Has an e-commerce mobile app;
  3. Similar price range and demographics;
  4. Fast-fashion retailer.

Key findings identified were:

  1. Most shoppers do not know of or use the apps;
  2. No in-store signage was found to encourage their usage;
  3. Competitors have a barcode scanner to provide additional product information. This feature integrates the in-store and mobile experience;
  4. Competitors’ apps have significantly better presentation and feel less cluttered;
  5. App approaches may be different — one is more editorial, while the other is focused on e-commerce.

The Big Questions — How Might We…

At the start of the project, we had three main questions in mind.
How might we…
  1. adapt the physical store experience into a mobile experience?
  2. use a mobile app to further enhance the physical store experience?
  3. adopt mobile payment or a mobile self-checkout at a physical store?

Defining the In-store Experience

First, we defined the unique experience of the physical store in order to adapt it to the mobile app.
  • Greeted with ‘Welcome to (the store)’ every time you enter the shop;
  • Same familiar shop layout at every outlet;
  • Wide open aisle, bright lights, neatly stacked shelves;
  • Sales and promotions throughout the year for different products each week;
  • Easy to find the right sizes without help from a sales assistant;
  • Strong visual branding from clothes tag to signages.

Contextual Inquiry/Field Study

Contextual inquiry at the physical store
We conducted a field study at an outlet by speaking to customers and shop assistants. We also showed them the product scan feature found on a competitor’s mobile app.

What we noticed and found out:

  1. Shoppers do not know of the app even though they frequently shop at the store.
  2. The current app is for e-commerce only.
  3. Shoppers said they would consider shopping online after learning of the app.
  4. Shoppers are wowed by the product scan feature as the technology is fascinating. The same technology is already available at a kiosk at the flagship store (in the city). Other smaller outlets (in the neighbourhoods) did not have this kiosk, probably due to space constraints.
  5. The same product might be cheaper in the app thanks to mobile-exclusive discounts, although shoppers may incur additional delivery fees ($6 for orders under $30).
  6. Various products are available only at the flagship store or on the mobile app.
  7. Long queues were observed at the store during peak hours.

User Interviews

Sample interview questions grouped by topics
We interviewed seven users to find out what they thought of the current app. The questions centred on touch points common to fashion e-commerce apps. For example, we asked questions related to:
  1. browsing for clothes,
  2. making a purchase,
  3. waiting for the delivery,
  4. receiving the items,
  5. and making returns.

Key findings from the interviews:

  1. The app is easy to browse, with no major navigation issues; the only issue is ‘dresses’ being classified under ‘tops’.
  2. Frustration comes from the lack of filtering and a complicated check-out process.
  3. The app lacks clarity in the delivery options and fees.
  4. The app presentation is messy.

Affinity Mapping

Existing App User
Shopper–Potential App User
After conducting the user interviews and contextual inquiry, our next step was to organise the insights into groups in an affinity map. With this map, we could identify common habits, problems, and pain points. The map also helped us identify two key personas (elaborated below), as post-its of the same colour tended to cluster together, e.g. red and pale blue post-its represent existing users.

User Personas and Customer Journey Map

Based on the patterns identified in the affinity map, we came up with two personas: an existing user of the app, and a current shopper who is a potential user of the app. Each persona describes a typical user or potential user, with their habits, problems, pain points, and other details.
Persona 1 — Existing user of the app
User Persona–Existing user of the app
Customer Journey Map–Shopping on the App
Katie prefers to shop online and is an existing user of the app. She wants quick access to all the discounts and finds it difficult to find the size and availability of the items she wants. While she is familiar and comfortable using the app, she hopes the user experience can be improved.
Persona 2 — Existing shopper and potential user of the app
User Persona–Shopper at physical store/Potential user of app
Customer Journey Map–Shopping at a physical store
Natalie shops at the physical store and is not aware of the existing app. While she enjoys shopping at the store, there are often long queues at the payment counter. She may be a potential user of the app since she uses other e-commerce apps to shop for clothes.

Potential project approaches:

  1. The redesign should not affect current users of the app. Navigation should be kept similar to the existing app and website.
  2. New features can be added to the app for current shoppers to use it in-store.
  3. Users should be able to access ‘Promotions’ quickly since it is a major feature of the brand.
  4. Increase awareness of the app through in-store posters and other marketing efforts.

Feature Prioritisation Matrix

Feature Prioritisation Matrix–User Needs vs Business Needs
57% of users surveyed rated 4 or higher on the importance of having a self-checkout counter in-store
Through a design studio process, we came up with various new features we intended to include in the new app. To arrive at a Minimum Viable Product (MVP, or Minimum Lovable Product, MLP), we conducted an online survey to find out what users want from the app. We then looked at the features from the business perspective and organised them according to user and business needs. Features in the top right corner (the box in red) are the ones that should be included in the new version of the app.

Storyboards

The new features are illustrated in storyboards, detailing the environment, scenario, and context where the app may be used.
Storyboard by Parul–Receiving a push-notification when user is near the store
Storyboard by Parul–Using the barcode scan and self-checkout function

Mid-Fi Prototype & First Usability Test

Mid-Fidelity Prototype by Parul
Since my team comprised two visual designers (myself included), we skipped straight to a mid-fidelity prototype after doing quick sketches. Visuals of the clothing may be important in helping users visualise the actual app.
This version was used for testing with actual customers on our second trip to the store. The purpose of the test was to determine whether customers were receptive to the new scanning and self-checkout feature.
Key findings from the usability test:
  1. Customers are able to identify the scan feature and its uses.
  2. Most customers could anticipate what would appear after scanning the product.
  3. However, they questioned the need to know more product information when they have the physical item on hand.
  4. Customers would use the self-checkout ‘only when there is a queue’. This is to be expected, since most Singaporeans are more comfortable paying by cash at a counter.
  5. However, most highlighted that they are slowly accepting mobile payments and self-checkout systems as part of the future retail experience.
  6. The wishlist feature was subsequently removed, as users did not need it.

Hi-Fi Prototype

Based on the usability test, we iterated towards a high-fidelity prototype. The branding was also strengthened in the design by using the right fonts and colours. The interactive prototype can be viewed on InVision.

Feature Demonstration

Scan Feature

We created a video to show the new scan feature on the app since it was impossible to prototype the actual feature.
App Scan Feature Prototype Demo

Delivery Target Bar

Another feature of the app is an animated target bar for free delivery. This encourages users to spend more to meet the free-delivery threshold while giving them greater clarity on delivery fees.
Animated delivery target bar

Geo-Fencing Push Notification

Users receive a mobile coupon through a push notification when their GPS location indicates that they are near a store outlet. This encourages them to use the app for self-checkout.
Receiving a mobile voucher through push notification when user is near an outlet

Usability Testing

Tasks assigned for usability tests
Participants were given four tasks to complete. Task 1 was conducted on both the existing app and the new app. The clicks for three of the tasks are illustrated below.
Where did the users click?
Time taken by user to find a dress on the current and new app
To collect quantitative data, we timed how long users took to complete the task on the existing app and the new app. The new design allowed users to complete the task more efficiently.
Quantitative data from the System Usability Scale (SUS)
In addition, we conducted a post-test survey to collect participants’ feedback on the new app. This was done with the System Usability Scale (SUS) test. The results were tabulated and scored using the method specified by the standardised test.
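For reference, SUS scoring follows a fixed formula: each odd-numbered (positively worded) item contributes its response minus 1, each even-numbered (negatively worded) item contributes 5 minus its response, and the raw sum is multiplied by 2.5 to give a score out of 100. A minimal Python sketch of the calculation:
def sus_score(responses):
    # responses: ten 1-5 Likert answers to the standard SUS statements
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 raw sum to 0-100

# Example participant: prints 72.5
print(sus_score([4, 2, 4, 2, 4, 2, 4, 3, 4, 2]))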

Results from system usability scale test:

  • Users rated the new app 69/100 (marginal).
  • Although this is below the acceptable score (>70), it was not a bad score.
  • The marginal score was due to the difficulty in performing task 4 (i.e. performing a self-checkout).
  • Designing a self-checkout is a challenging task due to the lack of existing models to follow. Users need time to learn and accept self-checkout methods.

Design Iteration — Improving the User Flow

After the usability test, we discovered that users were confused by the product detail page after scanning the barcode. They assumed that the item was already added to the cart after the scan.

Scanning Products

Revising the user flow to provide more feedback
Providing feedback to guide users in completing their task
We made the process more informative by providing feedback on what is happening. First, we prompted users to confirm adding the item to the bag after scanning the barcode. Next, we gave them the option to continue scanning or proceed to the shopping bag. This gives users more clarity, as they are presented with explicit options for the next step.

Self-Checkout

Revising the self-checkout user flow to provide more instructions
This is a case where a simpler user flow may actually cause greater confusion. With more steps inserted, users are more confident in performing the task.
Providing instructions on what to do after self-checkout payment
The revised self-checkout user flow may seem more complicated, but it provides greater clarity: users are instructed to proceed to the Express Packing Counter to have their items packed and the security tags removed. Without these instructions, users were unsure what to do after making a payment.

Promoting the App Usage

Through our redesign, we created opportunities for users to use the app within the physical store. To encourage usage, the launch should be accompanied by promotional materials around the stores.
Clothes tags and in-store posters
For example, the clothes tag can include a line informing users that they can scan it and perform a self-checkout with the app. The same message can appear on signage throughout the store.
Express Packing Counter for self-checkout users
As users have to get their items packed and security tags removed, we propose setting up Express Packing Counter lanes that serve these customers more quickly. This will also help raise awareness of the app.

Future Steps

In the short term:
  1. We propose to include features that will help users to find what they need. For example, we can include an image search feature so that users can find a similar style.
  2. Introduce a personalised feed for signed-in users, based on gender and body size, to suggest suitable styles and promotions.
In the long term:
  1. Align the current website with the new app after collecting user feedback for the new app.
  2. Rearrange products in the navigation based on knowledge of future product inventory.

Points to Note

The design of the app in the InVision prototype does not follow the iOS Human Interface Guidelines, due to my unfamiliarity with iPhone app design at the time. After studying the guidelines, I redesigned the app to match the style specified for the iPhone 7.
The main differences are in the system font (SF Pro Display) and the navigation labels. This ensures consistency with other apps on the iPhone.
Revising the navigation to match Apple iOS Human Interface Guidelines

Special thanks to:

Teammates, Parul Shukla & Cheryl Lee,
Instructor, Nie Zhen Zhi,
and Teaching Assistant, Wilson Chew

Friday, January 5, 2018

How we recreated Amazon Go in 36 hours


John Choi, me, our project apparatus, Ruslan Nikolaev, and Soheil Hamidi at our demo!
My colleagues and I wanted to create something that would make people go “wow” at our latest hackathon.
Because imitation is the sincerest form of flattery and IoT is incredibly fun to work with, we decided to create our own version of Amazon Go.
Before I explain what it took to make this, here’s the 3-minute demo of what we built!
There were four of us. Ruslan, a great full-stack developer who had experience working with Python. John, an amazing iOS developer. Soheil, another great full-stack developer who had experience with Raspberry Pi. And finally, there was me, on the tail end of an Android developer internship.
I quickly realized that there were a lot of moving parts to this project. Amazon Go works on the basis of real-time proximity sensors in conjunction with a real-time database of customers and their carts.
We also wanted to take things a step further and make the entry/exit experience seamless. We wanted to let people enter and exit the store without needing to tap their phones.
In order to engage users as a consumer-facing product, our app would need a well-crafted user interface, like the real Amazon Go.
On the day before the hackathon, I put together a pseudo-design doc outlining what we needed to do within the 36-hour deadline, incorporating the strengths of our team and the equipment at hand. The full, hastily assembled design doc can be seen below.

There were six main components to EZShop, our version of Amazon Go.
A quick diagram I whipped up visualizing the components of this project

The Kairos Facial Recognition API

The Kairos facial recognition API was a fundamental component for us. It abstracted away the work of identifying and storing unique faces. We used two of its endpoints: /enroll and /verify.
/enroll is described as:
Takes a photo, finds the faces within it, and stores the faces into a gallery you create.
We enrolled all new customers into a single “EZShop” gallery. A unique face_id attribute would be returned and stored with the customer’s registered name in our real-time database.
When we wanted to verify a potential customer’s image, we would POST it to the /verify endpoint. This would return the face_id with the highest probability of a match.
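As an illustration, here is roughly what those two calls look like in Python. This is a sketch, not our exact code: the app_id/app_key headers and the base64-encoded image payload follow Kairos’s documented conventions, but the precise field names are assumptions worth checking against the current API docs:
import base64
import requests

HEADERS = {
    "app_id": "YOUR_APP_ID",     # placeholder credentials
    "app_key": "YOUR_APP_KEY",
    "Content-Type": "application/json",
}

def encode_image(path):
    # Kairos accepts base64-encoded image data in the JSON payload
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode()

def enroll(image_path, subject_id):
    # Store a new customer's face in the shared "EZShop" gallery
    payload = {"image": encode_image(image_path),
               "subject_id": subject_id,
               "gallery_name": "EZShop"}
    return requests.post("https://api.kairos.com/enroll",
                         headers=HEADERS, json=payload).json()

def verify(image_path):
    # Match a snapshot against the gallery; the response includes
    # the face_id with the highest probability of a match
    payload = {"image": encode_image(image_path),
               "gallery_name": "EZShop"}
    return requests.post("https://api.kairos.com/verify",
                         headers=HEADERS, json=payload).json()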
In a real-world implementation, it probably would have been a better idea to use a natively implemented facial recognition pipeline with TensorFlow instead of a network API. But given our time constraints, the API served us very well.

The Realtime Firebase Database

The Firebase database was another fundamental piece of our puzzle. Every other component interacted with it in real time. Firebase allows customized change listeners to be created on any data within the database. That feature, coupled with the easy set-up process, made it a no-brainer to use.
The schema was incredibly simple. The database stored an array of items and an array of users. The following is an example JSON skeleton of our database:
{
  "items": [
    {
      "item_id": 1,
      "item_name": "Soylent",
      "item_stock": 1,
      "price": 10
    }
  ],
  "users": [
    {
      "face_id": 1,
      "name": "Subhan Nadeem",
      "in_store": false,
      "cart": [
        1
      ]
    }
  ]
}
New users would be added to the array of users in our database after registering with the Kairos API. Upon entry or exit, the customer’s boolean in_store attribute would be updated, which would be reflected in the manager and personal app UIs.
Customers picking up an item would result in an updated item stock. Upon recognizing which customer picked up what item, the item’s ID would be added to the customer’s cart array.
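To illustrate, a change listener with the firebase_admin Python SDK looks something like the sketch below (the service-account file and database URL are hypothetical; our actual listeners lived inside the iOS and Android apps):
import firebase_admin
from firebase_admin import credentials, db

# Hypothetical credentials and database URL
cred = credentials.Certificate("service-account.json")
firebase_admin.initialize_app(cred, {
    "databaseURL": "https://ezshop-demo.firebaseio.com",
})

def on_users_change(event):
    # Fires on any write under /users, e.g. an in_store flip or an
    # item ID appended to a customer's cart
    print("changed: /users%s -> %s" % (event.path, event.data))

# Register the listener; callbacks arrive on a background thread
db.reference("users").listen(on_users_change)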
I had planned for a cloud-hosted Node/Flask server that would route all activity from one device to another, but the team decided that it was much more efficient (although more hacky) for everybody to work directly upon the Firebase database.

The Manager and Personal Customer Apps

John, being the iOS wizard that he is, finished these applications in the first 12 hours of the hackathon! He really excelled at designing user-friendly and accessible apps.

The Manager App


This iPad application registered new customers into our Kairos API and Firebase database. It also displayed all customers in the store and the inventory of store items. The ability to interact directly with the Firebase database and observe changes made to it (e.g. when a customer’s in_store attribute changes from true to false) made this a relatively painless process. The app was a great customer-facing addition to our demo.

The Personal Shopping App


Once the customer was registered, we would hand a phone with this app installed to the customer. They would log in with their face (Kairos would recognize and authenticate). Any updates to their cart would be shown on the phone instantly. Upon exiting the store, the customer would also receive a push notification on this phone stating the total amount they spent.

The Item Rack, Sensors, and Camera

Soheil and Ruslan worked tirelessly for hours to perfect the design of the item shelf apparatus and the underlying Pi Python scripts.
The item rack apparatus. Three items positioned in rows, a tower for the security camera, and ultrasonic sensors positioned at the rear
There were three items positioned in rows. At the end of two of the rows, an ultrasonic proximity sensor was attached. We only had two ultrasonic sensors, so the third row had a light sensor under the items, which did not work as seamlessly. The ultrasonic sensors were connected to a Raspberry Pi, which ran simple Python scripts to process readings of the distance to the next closest object (either the closest item or the end of the rack). The light sensor detected a “dark” or “light” state (dark if an item was on top of it, light otherwise).
When an item was lifted, the sensor’s reading would change and trigger an update to the item’s stock in the database. The camera (Android phone) positioned at the top of the tower would detect this change and attempt to recognize the customer picking up the item. The item would then instantly be added to that customer’s cart.
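A simplified sketch of the Pi-side polling loop, assuming an HC-SR04-style ultrasonic sensor (the GPIO pin numbers and threshold are illustrative, and the database update is left as a stub):
import time
import RPi.GPIO as GPIO

TRIG, ECHO = 23, 24           # illustrative GPIO pin numbers
PICKUP_THRESHOLD_CM = 10      # distance jump that signals a lifted item

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def read_distance_cm():
    # Send a 10-microsecond trigger pulse, then time the echo
    GPIO.output(TRIG, True)
    time.sleep(0.00001)
    GPIO.output(TRIG, False)
    start = end = time.time()
    while GPIO.input(ECHO) == 0:
        start = time.time()
    while GPIO.input(ECHO) == 1:
        end = time.time()
    return (end - start) * 34300 / 2  # speed of sound, there and back

baseline = read_distance_cm()
while True:
    distance = read_distance_cm()
    if distance - baseline > PICKUP_THRESHOLD_CM:
        # The nearest item moved away: decrement item_stock in Firebase here
        baseline = distance
    time.sleep(0.1)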

Entrance and Exit Cameras

I opted to use Android phones as our facial recognition cameras, due to my relative expertise with Android and the convenience phones offer for capturing and processing images in one place.
The phones were rigged on both sides of a camera tripod, one side at the store’s entrance, and the other at the store exit.
A camera tripod, two phones, and lots of tape
Google has an incredibly useful Face API that implements a native pipeline for detecting human faces and other related useful attributes. I used this API to handle the heavy lifting for facial recognition.
In particular, the API provided an approximate distance of a detected face from the camera. Once a customer’s face was within a close distance, I would take a snapshot of the customer, verify it against the Kairos API to ensure the customer existed in our database, and then update the Firebase database with the customer’s in-store status.
I also added a personalized text-to-speech greeting upon recognizing the customer. That really ended up wowing everybody who used it.
The result of this implementation can be seen here:
Once the customer left the store, the exit-detection state of the Android application was responsible for retrieving the items the customer picked up from the database, calculating the total amount the customer spent, and then sending a push notification to the customer’s personal app via Firebase Cloud Messaging.
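In Python, that exit-time settlement step might look like the sketch below (our actual implementation lived in the Android app; the device token handling is an assumption, since the schema above does not record where tokens are stored):
from firebase_admin import db, messaging

def settle_and_notify(face_id, device_token):
    # Look up the customer and total the prices of the items in their cart
    users = db.reference("users").get()
    items = {i["item_id"]: i for i in db.reference("items").get()}
    user = next(u for u in users if u["face_id"] == face_id)
    total = sum(items[item_id]["price"] for item_id in user["cart"])

    # Push the bill to the customer's personal app via FCM
    messaging.send(messaging.Message(
        token=device_token,   # assumed to be saved at registration time
        notification=messaging.Notification(
            title="Thanks for shopping at EZShop!",
            body="You spent $%.2f today." % total,
        ),
    ))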

Of the 36 hours, we slept for about 6. We spent our entire time confined to a classroom in the middle of downtown Toronto. There were countless frustrating bugs and implementation roadblocks we had to overcome. There were some bugs in our demo that you probably noticed, such as the cameras failing to recognize several people in the same shot.
We would have also liked to implement additional features, such as detecting customers putting items back on the rack and adding a wider variety of items.
Our project ended up winning first place at the hackathon. We set up an interactive booth for an hour (the Chipotle box castle that can be seen in the title picture) and had over a hundred people walk through our shop. People would sign up with a picture, log into the shopping app, walk into the store, pick up an item, walk out, and get notified of their bill instantly. No cashiers, no lines, no receipts, and a very enjoyable user experience.
Walking a customer through our shop
I was proud of the way our team played to each individual’s strengths and created a well put-together full-stack IoT project in the span of a few hours. It was an incredibly rewarding feeling for everybody, and it’s something I hope to replicate in my career in the future.
I hope this gave you some insight into what goes on behind the scenes of a large, rapidly prototyped, and hacky hackathon project such as EZShop.
