Monday, January 8, 2018

UI/UX Case Study: Mobile Self-Checkout App Design Concept


Fashion Retail and E-Commerce App Redesign

This UI/UX case study documents the process involved in redesigning a fashion retail and e-commerce app. The app includes a product scan feature that lets customers perform a self-checkout at a physical store. The design sprint took 11 days to complete and was submitted to the UXDI course at General Assembly, Singapore.

Project Brief

Work in a team to identify problems and/or opportunities with an existing mobile application and utilise your knowledge to design a solution.
For this project, my team selected the Uniqlo Singapore app to redesign. The brand name is not mentioned again in this documentation; the ideas below apply to most fashion retailers with an e-commerce presence.

Overview

The 11-day group project (3 members) included the following processes and methodologies:
#1 Discover
  • Background research
  • Contextual inquiry
  • User Interviews
  • Online surveys
  • Competitive analysis
  • Heuristic evaluation
#2 Define
  • Affinity mapping
  • User personas
  • Customer journey mapping
  • Feature prioritisation
  • Design studio
#3 Design
  • Wireframe
  • InVision prototype
  • Visual mock-up
#4 Testing
  • Usability testing
  • System Usability Scale study
#5 Deliver
  • Interactive prototype
  • Visual mock-up
  • Research report
  • Presentation

The Context — Competitive Retail Scene & Mobile Payment in Singapore

News clippings from The Straits Times and Channel NewsAsia
It is increasingly difficult for retail businesses to remain competitive in Singapore, owing to the high rent required to maintain a physical store and the difficulty of hiring lower-skilled sales assistants.
In addition, consumers are increasingly shopping online on platforms such as Taobao, ASOS, and ZALORA for their fashion fix.
In the recent Singapore National Day Rally speech (August 2017), the Prime Minister pushed for consumers and retailers to adopt mobile payments, through initiatives such as ‘PayNow’ and a common national QR code.
Based on this setting, my team picked a retailer with both a physical and an online (mobile app) presence as our project.

Heuristic Evaluation

Screen grab from heuristic evaluation report–Consistency and Standards
First, we analysed the existing app to identify key problems and issues, and compared our findings against online reviews on Google Play and the App Store.
App reviews on Google Play and Apple App Store
The main issue discovered was that the app is a hybrid app, i.e. it pulls its content from a web page. This creates potential issues:
  1. Slow loading as most of the information is downloaded only when needed.
  2. The experience is not catered to mobile. Fonts, buttons, and images appear too small on the mobile phone.
  3. Navigation is inconsistent throughout the app.

Competitive Analysis

Competitive Analysis–Comparing features on the Home screen
Next, we compared the app to competitors’ apps. The competitors were selected based on these 4 criteria:
  1. Fashion retailers with a physical store in Singapore;
  2. With an e-commerce mobile app;
  3. With a similar price range and target demographic;
  4. Operating a fast-fashion model.

Key findings identified were:

  1. Most shoppers do not know of or use the apps;
  2. No in-store signage was found encouraging use of the apps;
  3. Competitors have a barcode scanner to provide additional product information. This feature integrates the in-store and mobile experience;
  4. Competitors have a significantly better app presentation as it feels less cluttered;
  5. App approaches may be different — one is more editorial, while the other is focused on e-commerce.

The Big Questions — How Might We…

At the start of the project, we had three main questions in mind.
How might we…
  1. adapt the physical store experience into a mobile experience?
  2. use a mobile app to further enhance the physical store experience?
  3. adopt mobile payment or a mobile self-checkout at a physical store?

Defining the In-store Experience

First, we defined the unique experience at the physical store so that we could adapt it to the mobile app.
  • Greeted with ‘Welcome to (the store)’ every time you enter the shop;
  • Same familiar shop layout at every outlet;
  • Wide open aisles, bright lights, neatly stacked shelves;
  • Sales and promotions throughout the year for different products each week;
  • Easy to find the right sizes without help from a sales assistant;
  • Strong visual branding, from clothes tags to signage.

Contextual Inquiry/Field Study

Contextual inquiry at the physical store
We conducted a field study at an outlet by speaking to customers and shop assistants. We also showed customers the product scan feature found on a competitor’s mobile app.

What we noticed and found out:

  1. Shoppers do not know of the app even though they frequently shop at the store.
  2. The current app is for e-commerce only.
  3. Shoppers said they would consider shopping online after learning about the app.
  4. Shoppers are wowed by the product scan feature and find the technology fascinating. The same technology is already available at a kiosk at the flagship store (in the city). Other smaller outlets (in the neighbourhoods) did not have this kiosk, probably due to space constraints.
  5. The same product might be cheaper in the app because of mobile-exclusive discounts, although shoppers may incur an additional delivery fee ($6 for orders under $30).
  6. Various products are available only at the flagship store or on the mobile app.
  7. Long queues were observed at the store during peak hours.

User Interviews

Sample interview questions grouped by topics
We interviewed 7 users to find out what they thought of the current app. The questions were centred on touch points common to fashion e-commerce apps. For example, we asked questions related to:
  1. browsing for clothes,
  2. making a purchase,
  3. waiting for the delivery,
  4. receiving the items,
  5. and making returns.

Key findings from the interviews:

  1. The app is easy to browse, so there are no major issues with the navigation. The only exception is ‘dresses’ being classified under ‘tops’.
  2. Frustration comes from the lack of filtering and a complicated check-out process.
  3. The app lacks clarity in the delivery options and fees.
  4. The app presentation is messy.

Affinity Mapping

Existing App User
Shopper–Potential App User
After conducting the user interviews and contextual inquiry, our next step was to organise the insights into groups in an affinity map. With this map, we could identify common habits, problems, and pain points. The map also helped us identify 2 key personas (elaborated below), since post-its of the same colour were usually grouped together, e.g. red and pale blue post-its are existing users.

User Personas and Customer Journey Map

Based on the patterns identified in the affinity map, we came up with 2 personas — an existing user of the app, and a current shopper who is a potential user of the app. Each persona describes a typical user or potential user: their habits, problems, pain points, and other details.
Persona 1 — Existing user of the app
User Persona–Existing user of the app
Customer Journey Map–Shopping on the App
Katie prefers to shop online and is an existing user of the app. She wants quick access to all the discounts and finds it difficult to check the size and availability of the items she wants. While she is familiar and comfortable with the app, she hopes the user experience can be improved.
Persona 2 — Existing shopper and potential user of the app
User Persona–Shopper at physical store/Potential user of app
Customer Journey Map–Shopping at a physical store
Natalie shops at the physical store and is not aware of the existing app. While she enjoys shopping at the store, there are often long queues at the payment counter. She may be a potential user of the app since she uses other e-commerce apps to shop for clothes.

Potential project approaches:

  1. The redesign should not affect current users of the app. Navigation should be kept similar to the existing app and website.
  2. New features can be added so that current shoppers can use the app in-store.
  3. Users should be able to access ‘Promotions’ quickly since it is a major feature of the brand.
  4. Increase awareness of the app through in-store posters and other marketing efforts.

Feature Prioritisation Matrix

Feature Prioritisation Matrix–User Needs vs Business Needs
57% of users surveyed rated 4 or more on the importance of having a self-checkout counter in-store
Through a design studio process, we came up with various new features we intended to include in the new app. To arrive at a Minimum Viable Product (MVP, or Minimum Lovable Product, MLP), we conducted an online survey to find out what users want from the app. We then looked at the features from the business perspective and organised them according to user and business needs. Features in the top right corner (the box in red) are the ones that should be included in the new version of the app.
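To make the prioritisation concrete, here is a minimal sketch of how such a matrix could be expressed in code; the feature names, scores, and cut-off values are hypothetical examples, not the team’s actual survey data.

```swift
// Illustrative feature prioritisation matrix.
// Feature names, scores, and thresholds are hypothetical examples.
struct Feature {
    let name: String
    let userValue: Int      // 1 (low) to 5 (high), e.g. from survey responses
    let businessValue: Int  // 1 (low) to 5 (high), e.g. from stakeholder input
}

let features = [
    Feature(name: "Barcode scan", userValue: 5, businessValue: 4),
    Feature(name: "Self-checkout", userValue: 4, businessValue: 5),
    Feature(name: "Wishlist", userValue: 2, businessValue: 2),
    Feature(name: "Image search", userValue: 3, businessValue: 2),
]

// The "top right corner" of the matrix: high user need AND high business need.
let mvpCandidates = features.filter { $0.userValue >= 4 && $0.businessValue >= 4 }
print(mvpCandidates.map { $0.name })  // ["Barcode scan", "Self-checkout"]
```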

Storyboards

The new features are illustrated in storyboards, detailing the environment, scenario, and context where the app may be used.
Storyboard by Parul–Receiving a push-notification when user is near the store
Storyboard by Parul–Using the barcode scan and self-checkout function

Mid-Fi Prototype & First Usability Test

Mid-Fidelity Prototype by Parul
Since my team comprises two visual designers (myself included), we skipped to a mid-fidelity prototype after doing quick sketches. Visuals of the clothing may be important in helping users visualise the actual app.
This version was used for testing with actual customers on our second trip to the store. The purpose of the test was to determine whether customers are receptive to the new scanning and self-checkout feature.
Key findings from the usability test:
  1. Customers are able to identify the scan feature and its uses.
  2. Most customers are able to anticipate what will appear after scanning the product.
  3. However, they questioned the need to know more product information when they have the physical item on hand.
  4. Customers will use the self-checkout ‘only when there is a queue’. This is to be expected since most Singaporeans are more comfortable making payment by cash at a counter.
  5. However, most highlighted that they are slowly accepting mobile payments and self-checkout systems as part of the future retail experience.
  6. The wishlist feature was subsequently removed, as users did not require it.

Hi-Fi Prototype

Based on the usability test, we iterated to a high-fidelity prototype. The branding was also enhanced by using the right fonts and colours. The interactive prototype can be viewed on InVision.

Feature Demonstration

Scan Feature

We created a video to show the new scan feature on the app since it was impossible to prototype the actual feature.
App Scan Feature Prototype Demo
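For context, a native implementation of such a scan feature on iOS would typically use AVFoundation’s metadata output. The sketch below is a minimal assumed approach — the barcode types and the product lookup call are placeholders, not the app’s actual code.

```swift
import AVFoundation
import UIKit

// Minimal barcode-scanner sketch using AVCaptureMetadataOutput.
// The product lookup (`showProduct`) is a hypothetical placeholder.
final class ScanViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {
    private let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureMetadataOutput()
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        output.setMetadataObjectsDelegate(self, queue: .main)
        output.metadataObjectTypes = [.ean8, .ean13]   // typical clothing-tag barcode formats

        let preview = AVCaptureVideoPreviewLayer(session: session)
        preview.frame = view.bounds
        preview.videoGravity = .resizeAspectFill
        view.layer.addSublayer(preview)
        session.startRunning()
    }

    // Called when a barcode is recognised in the camera feed.
    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        guard let code = (metadataObjects.first as? AVMetadataMachineReadableCodeObject)?.stringValue else { return }
        session.stopRunning()
        showProduct(for: code)
    }

    private func showProduct(for barcode: String) {
        // Hypothetical placeholder: look up the scanned product, show its details,
        // and prompt the user to add it to the shopping bag.
        print("Scanned barcode: \(barcode)")
    }
}
```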

Delivery Target Bar

Another feature of the app is an animated target bar for free delivery. This encourages users to spend more to reach the free-delivery threshold while giving them greater clarity about delivery fees.
Animated delivery target bar
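As a rough illustration of how the target bar could work, here is a minimal SwiftUI sketch; the $60 free-delivery threshold and the wording are assumed example values, not the retailer’s actual policy.

```swift
import SwiftUI

// Minimal sketch of an animated free-delivery target bar.
// The $60 threshold is a hypothetical example value.
struct DeliveryTargetBar: View {
    let cartTotal: Double
    let freeDeliveryThreshold: Double = 60.0

    private var message: String {
        let remaining = freeDeliveryThreshold - cartTotal
        return remaining > 0
            ? "Spend $\(Int(remaining.rounded())) more to enjoy free delivery"
            : "You qualify for free delivery!"
    }

    var body: some View {
        VStack(alignment: .leading, spacing: 4) {
            // Fills up as the cart total approaches the free-delivery threshold.
            ProgressView(value: min(cartTotal, freeDeliveryThreshold), total: freeDeliveryThreshold)
                .animation(.easeInOut, value: cartTotal)
            Text(message)
                .font(.footnote)
        }
    }
}

// Usage: DeliveryTargetBar(cartTotal: 42.50)
```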

Geo-Fencing Push Notification

Users will receive a mobile coupon through push notification when their GPS indicates that they are near a store outlet. This will encourage them to use the app for self-checkout.
Receiving a mobile voucher through push notification when user is near an outlet
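On iOS, this kind of geo-fenced coupon can be delivered with a location-triggered local notification. The sketch below is a minimal assumed approach using Core Location and UserNotifications; the coordinates, radius, and copy are invented for illustration, and location plus notification permissions must be requested separately.

```swift
import CoreLocation
import UserNotifications

// Minimal sketch: schedule a local notification that fires when the user enters
// a geofence around a store outlet. Coordinates, radius, and copy are
// hypothetical examples.
func scheduleStoreCoupon() {
    let outlet = CLLocationCoordinate2D(latitude: 1.3048, longitude: 103.8318) // example coordinate
    let region = CLCircularRegion(center: outlet, radius: 200, identifier: "orchard-outlet")
    region.notifyOnEntry = true
    region.notifyOnExit = false

    let content = UNMutableNotificationContent()
    content.title = "You're near our store!"
    content.body = "Here's a $5 voucher — scan and self-checkout with the app today."

    // Fires once, when the user crosses into the region.
    let trigger = UNLocationNotificationTrigger(region: region, repeats: false)
    let request = UNNotificationRequest(identifier: "store-coupon", content: content, trigger: trigger)
    UNUserNotificationCenter.current().add(request)
}
```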

Usability Testing

Tasks assigned for usability tests
Participants were given 4 tasks to complete. Task 1 was conducted on both the existing app and the new app. The clicks for 3 of the tasks are illustrated below.
Where did the users click?
Time taken by user to find a dress on the current and new app
To collect quantitative data, we timed how long users took to complete the task on the existing app and the new app. The new design allows users to complete the task more efficiently.
Quantitative Data from the System Usability Scale (SUS)
In addition, we conducted a post-test survey to collect feedback from the participants on their views of the new app. This was done with the System Usability Scale (SUS) test. The results were tabulated and calculated based on the method specified by the standardised test.
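For reference, a SUS score is derived from ten responses on a 1–5 scale: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to give a 0–100 score. Below is a small sketch of that calculation with a made-up response set (not the study’s actual data).

```swift
// System Usability Scale (SUS) scoring: ten items rated 1-5.
// Odd-numbered items (positive statements) score (response - 1);
// even-numbered items (negative statements) score (5 - response);
// the sum is scaled by 2.5 to a 0-100 range.
func susScore(responses: [Int]) -> Double {
    precondition(responses.count == 10, "SUS requires exactly 10 responses")
    let total = responses.enumerated().reduce(0) { sum, item in
        let (index, response) = item
        // index 0, 2, 4, ... correspond to the odd-numbered questionnaire items (Q1, Q3, ...)
        return sum + (index % 2 == 0 ? response - 1 : 5 - response)
    }
    return Double(total) * 2.5
}

// Hypothetical single participant's responses (not the study's actual data):
print(susScore(responses: [4, 2, 4, 3, 4, 2, 4, 2, 4, 3]))  // 70.0
```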

Results from system usability scale test:

  • Users rated the new app 69/100 (marginal).
  • Although this is below the acceptable score (>70), it was not a bad score.
  • The marginal score was due to the difficulty in performing task 4 (i.e. performing a self-checkout).
  • Designing a self-checkout is a challenging task due to the lack of existing models to follow. Users need time to learn and accept self-checkout methods.

Design Iteration — Improving the User Flow

After the usability test, we discovered that users were confused by the product detail page after scanning the barcode. They assumed that the item was already added to the cart after the scan.

Scanning Products

Revising the user flow to provide more feedback
Providing feedback to guide users in completing their task
We made the process more informative by providing feedback on what is happening. First, after scanning the barcode, users are prompted to confirm adding the item to the bag. Next, they are given the option to continue scanning or to proceed to the shopping bag. This gives users greater clarity, as they are offered explicit options for the next step.
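One way to reason about the revised flow is as a small state machine. The sketch below is an illustrative model of the steps described above, not the actual app code.

```swift
// Illustrative state model for the revised scan-to-bag flow.
enum ScanFlowState {
    case scanning
    case confirmAddToBag(barcode: String)   // "Add this item to your bag?"
    case itemAdded(barcode: String)         // "Continue scanning or go to bag?"
    case shoppingBag
}

enum ScanFlowEvent {
    case scanned(barcode: String)
    case confirmAdd, cancel
    case continueScanning, goToBag
}

func next(state: ScanFlowState, event: ScanFlowEvent) -> ScanFlowState {
    switch (state, event) {
    case (.scanning, .scanned(let code)):           return .confirmAddToBag(barcode: code)
    case (.confirmAddToBag(let code), .confirmAdd): return .itemAdded(barcode: code)
    case (.confirmAddToBag, .cancel):               return .scanning
    case (.itemAdded, .continueScanning):           return .scanning
    case (.itemAdded, .goToBag):                    return .shoppingBag
    default:                                        return state   // ignore irrelevant events
    }
}
```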

Self-Checkout

Revising the self-checkout user flow to provide more instructions
This is a case where a simpler user flow may actually cause greater confusion for the user. With more steps inserted, users are more confident in performing their tasks.
Providing instructions on what to do after self-checkout payment
The revised self-checkout user flow may seem a lot more complicated, but it provides greater clarity to users, because instructions direct them to proceed to the Express Packing Counter to have their items packed and the security tags removed. Without these instructions, users were unsure what to do after making a payment.

Promoting the App Usage

Through our app redesign, we created opportunities for users to use the app within the physical store. To encourage this usage, the redesign should be accompanied by various promotional materials around the stores.
Clothes Tag and InStore Posters
For example, the clothes tag can include a line informing users that they can scan and perform a self-checkout with the app. This can also be included in the signage found throughout the store.
Express Packing Counter for self-checkout users
As users have to get their items packed and security sensors removed, we propose setting up Express Packing Counter lanes that serve these customers more quickly. This will also help to raise awareness of the app.

Future Steps

In the short term:
  1. We propose including features that help users find what they need. For example, an image search feature would let users find a similar style.
  2. A personalised feed for signed-in users, based on gender and body size, to suggest suitable styles and promotions.
In the long term:
  1. Align the current website with the new app after collecting user feedback for the new app.
  2. Rearrange products in the navigation based on knowledge of future product inventory.

Points to Note

The design of the app in the InVision prototype does not follow the guidelines listed in the iOS Human Interface Guidelines. This was due to my unfamiliarity with iPhone app design. After studying the guide, I redesigned the app to match the style specified for iPhone 7.
The main differences are in the system font (SF Pro Display) and in the navigation labels. This ensures consistency with the rest of the iPhone experience.
Revising the navigation to match Apple iOS Human Interface Guidelines

Special thanks to:

Team mates, Parul Shukla & Cheryl Lee,
Instructor, Nie Zhen Zhi,
and Teaching Assistant, Wilson Chew

Thursday, January 4, 2018

I Was Supposed to be an Architect


I’m leading a VR development studio, but the truth is I’ve been navigating a series of epic career learning curves that have taken me far outside of my comfort zone, and I wouldn’t have it any other way.
Mainstreet, Mall or Modem
On my quest to start sharing more about our process and lessons learned on the virtual frontier, I thought I’d start with a bit of background on how I arrived here in the first place.
I studied and practiced architecture, but I’ve been fascinated with virtual technologies as far back as I can remember. In fact, my architectural thesis project in grad school (image above) focused on how VR and digital technologies would someday revolutionize architecture — specifically retail architecture. This was 17 years ago, when VR was very expensive and largely inaccessible, but the brilliant pioneers innovating in this field were demonstrating its massive potential. It was only a matter of time before VR found its way to the mainstream.
Like so many other physical manifestations, from music to books and beyond, I believe buildings are subject to a similar digital transcendence. It’s already happening in a pretty big way, and this is just the beginning of a major architectural transformation that might take another decade or two to fully surface, but I digress… I’m saving this interest for a future pivot, and almost certainly another epic learning curve to go with it.
I tried using Everquest to visualize architecture.
I had a level 47 Dark Elf Shadow Knight in Everquest, but spent most of my time wandering around, exploring the environments. What I really wanted to do was import my own architectural models to explore them inside the game.
If they could have such elaborate dungeons and forts to explore in Everquest, with people from all around the world working together in the game virtually, why couldn’t the same technology also be used to visualize a new construction project, with the architect, building owner, and construction team exploring or collaborating on the design together?
This quest to visualize architecture in a real-time world became a ‘first principle’ in my career path that I’ve been chasing ever since.
I met my amazing and tremendously patient wife, Kandy, in grad school, and after studying architecture together in Europe and graduating, we practiced architecture for some time before starting our own firm, Crescendo Design, focused on eco-friendly, sustainable design principles.
Then one day in 2006, I read an article in Wired about Second Life — a massively multi-player world where users could create their own content. Within an hour, I was creating a virtual replica of a design we had on the boards at the time. I had to use the in-world ‘prims’ to build it, but I managed.
I was working in a public sandbox at the time, and when I had the design mostly finished, I invited the client in to explore it. They had 2 young kids, who were getting a huge kick out of watching over their parents’ shoulders as they walked through what could soon be their new home.
The Naked Lady, the Sheriff Bunny, and Epic Learning Curve #1.
We walked in the front door, when suddenly a naked woman showed up and started blocking the doorways. I reported her to the ‘Linden’ management, and a little white bunny with a big gold sheriff’s badge showed up and kicked her out. “Anything else I can help with?” Poof.. the bunny vanished and we continued our tour. That’s when I realized I needed my own virtual island (and what an odd place Second Life was).
But then something amazing happened that literally changed my career path, again.
I left one of my houses in that public sandbox overnight. When I woke up in the morning and logged in, someone had duplicated the house to create an entire neighborhood — and they were still there working on it.
Architectural Collaboration on Virtual Steroids
I walked my avatar, Keystone Bouchard, into one of the houses and found a group of people speaking a foreign language (I think it was Dutch?) designing the kitchen. They had the entire house decorated beautifully.
One of the other houses had been modified by a guy from Germany who thought the house needed a bigger living room. He was still working on it when I arrived, and while he wasn’t trained in architecture, he talked very intelligently about his design thinking and how he resolved the new roof lines.
I was completely blown away. This was architectural collaboration on virtual steroids, and it opened the door to another of the ‘first principle’ vision quests I’m still chasing. Multi-player architectural collaboration in a real-time virtual world is powerful stuff.
Steve Nelson, Jon Brouchoud, and Carl Bass delivering Keynote at Autodesk University 2006
One day Steve Nelson’s avatar, Kiwini Oe, visited my Architecture Island in Second Life and offered me a dream job designing virtual content at his agency, Clear Ink, in Berkeley, California. Kandy and I decided to relocate there from Wisconsin, where I enjoyed the opportunity to build virtual projects for Autodesk, the U.S. House of Representatives, Sun Microsystems and lots of other virtual installations. I consider that time to be one of the most exciting in my career, and it opened my eyes to the potential for enterprise applications for virtual worlds.
Wikitecture
I started holding architectural collaboration experiments on Architecture Island. We called it ‘Wikitecture.’ My good friend, Ryan Schultz, from architecture school suggested we organize the design process into a branching ‘tree’ to help us collaborate more effectively.
Studio Wikitecture was born, and we went on to develop the ‘Wiki Tree’ and one of our projects won the Founder’s Award and third place overall from over 500 entries worldwide in an international architecture competition to design a health clinic in Nyany, Nepal.
These were exciting times, but we were constantly faced with the challenge that we weren’t Second Life’s target audience. It was a consumer-oriented platform, and Linden Lab was resolutely and justifiably focused on growing its virtual land sales and in-world economy, not on building niche-market tools to help architects collaborate. I don’t blame them — more than 10 years after it launched, Second Life still has an in-world economy of real-money transactions larger than that of some small countries.
We witnessed something truly extraordinary there — something I haven’t seen or felt since. Suffice it to say, almost everything I’ve done in the years since has been toward my ultimate goal of someday, some way, somehow, instigating the conditions that gave rise to such incredible possibilities. We were onto something big.

Should Regulators Force Facebook To Ship a “Start Over” Button For Users?


I don’t really understand most of the proposals to “regulate” Facebook. There are some concrete proposals on the table regarding political ads and updating antitrust for the data age, but other punditry is largely consumer advocacy kabuki. For example, blunting the data Facebook can use to target ads or tune the newsfeed hurts the user experience, and there’s really no stable way to draw a line around what’s appropriate versus not. These experiences are too fluid. But while I want to keep the government out of the product design business, there’s an alternate path that has merit: establish a baseline for the control a person has over their data on these systems.
Today the platforms give their users a single choice: keep your account active or delete your account. Sure, some expose small amounts of ad targeting data and let you manipulate that, but on the whole they provide limited or no control over your ability to “start over.” Want to delete all your tweets? You have to use a third party app. Want to delete all your Facebook posts? Good luck with that. Nope, once you’re in the mousetrap, there’s no way out except account suicide.
BUT is that really fair? Over multiple years, we all change. Things we said in 2011 may or may not represent us today. And these services evolve — did we think we’d be using Facebook as a primary source of news and private messaging back when we were posting baby photos? Did you think they’d also own Instagram, WhatsApp, Oculus and so on when you created accounts on those services? We’re the frogs, slowly boiling in the pot of water.
What if every major platform was required to have something between Create Account and Delete Account? One which allows you to keep your user name but selectively delete the data associated with the account? For Facebook, you could have a set of individual toggles to Delete All Friend Connections, Delete All Posts, Delete All Targeting Data. Each of these could be used individually or together to give you a fresh start. Maybe you want to preserve your social graph but wipe your feed? Maybe you want to keep your feed but rebuild your graph.
Or for Twitter: Delete All Likes, Delete All Tweets, Delete All Follows, Delete All Targeting Data.
Or for YouTube: Delete All Uploads, Delete All Subscriptions, Delete All Likes, Delete All Targeting Data.
The technical requirements to develop these features are only complicated in the sense of making sure you’re deleting the data everywhere it’s stored; otherwise, every product already supports a “null” state — it looks very much like a new account. This leads me to believe that the only reasons these features don’t exist today are (a) it would be bad for business and (b) an actual or perceived lack of consumer demand. Anecdotally, it feels like (b) is changing — more and more people I know wipe their tweets, talk about deleting their histories, and so on. Imagine the ability to stage a “DataBoycott” by clearing your history if you think Facebook is taking liberties with your privacy. This is what keeps power in check.
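Purely as a thought experiment, those selective-deletion controls could be modelled as a handful of independent scopes behind one “purge” operation. The sketch below is hypothetical — it is not any platform’s real API — and the hard engineering problem remains making sure the data is actually deleted everywhere it is stored.

```swift
// Hypothetical model of a "start over" control sitting between
// Create Account and Delete Account. Not a real platform API.
struct StartOverOptions: OptionSet {
    let rawValue: Int
    static let friendConnections = StartOverOptions(rawValue: 1 << 0)
    static let posts             = StartOverOptions(rawValue: 1 << 1)
    static let targetingData     = StartOverOptions(rawValue: 1 << 2)
    static let likes             = StartOverOptions(rawValue: 1 << 3)
}

protocol AccountDataStore {
    // Must delete the selected data everywhere it is stored -- the hard part.
    func purge(_ options: StartOverOptions) throws
}

// Example: keep the social graph, wipe the feed and the ad-targeting history.
func freshStart(on store: AccountDataStore) throws {
    try store.purge([.posts, .targetingData])
}
```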

The Truth About The Old iPhone Slowdown Fiasco



For a few weeks now, I have wanted to write an article about the slow-down of old iPhones. This desire started when I was having online conversations about it. In those conversations, many people did not understand clearly what was happening.
Then I went to the Apple Store to get my MacBook Pro fixed. On the way, I discovered that my Lyft driver totally misunderstood the problem. The final straw was that the Apple technician at the Genius Bar also did not understand the problem.
Then Apple announced that it was offering a $50 discount on battery replacement for old iPhones, and I assumed that that was the end of the matter, and that this article would be redundant.
But no, people were still saying, “Apple slowed down old iPhones to make us buy new iPhones!”
Apple did not write iOS code that intentionally runs more slowly on older iPhones so that people would be forced to buy newer iPhones. Nor did Apple get caught red-handed by people who found that replacing their battery sped up their old iPhones.

What Actually Happened

In the most recent releases of iOS, Apple added an algorithm intended to increase battery life and reduce brown-out-related shut-downs in phones with old batteries. A brown-out happens when an aging battery cannot deliver enough peak power, causing the phone to shut down suddenly. The intention was to improve the user experience on older phones.
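To make the mechanism concrete, here is a purely illustrative sketch of the kind of logic being described: performance is capped based on the measured health of the battery, not on the age or model of the phone. The names and thresholds are invented for illustration and are not Apple’s actual implementation.

```swift
// Purely illustrative -- NOT Apple's actual code. Names and thresholds are invented.
// The point: performance is capped based on measured battery health,
// not on the age or model of the phone itself.
struct BatteryState {
    let maximumCapacity: Double   // fraction of design capacity, e.g. 0.78
    let peakPowerCapable: Bool    // can the cell deliver peak current without a voltage sag?
}

func performanceCeiling(for battery: BatteryState) -> Double {
    // A healthy battery runs the chip at full speed.
    if battery.peakPowerCapable && battery.maximumCapacity > 0.8 {
        return 1.0
    }
    // A degraded battery gets a lower ceiling to avoid brown-out shutdowns.
    return battery.maximumCapacity > 0.6 ? 0.7 : 0.5
}

// A phone with a new battery is unaffected; the same phone with a worn battery
// is throttled -- and replacing the battery restores full speed.
print(performanceCeiling(for: BatteryState(maximumCapacity: 0.95, peakPowerCapable: true)))  // 1.0
print(performanceCeiling(for: BatteryState(maximumCapacity: 0.72, peakPowerCapable: false))) // 0.7
```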
An unfortunate, but somewhat foreseeable, result was that the old phones slowed down. The slow-down was noticed by some users, who reported it on social media and discovered that replacing the battery with a new one resolved the issue.
Apple then reported that it had added this new mechanism for managing old batteries. Apple’s claim was supported by the reports of new batteries fixing the problem. I also replaced my battery in my iPhone 6 Plus, and my phone functioned at a normal speed again.
All the evidence points to Apple honestly trying to improve the experience for users of older phones with older batteries. I imagine that it’s very hard to beta-test these battery-management algorithms, because doing so requires lots of old phones with aging batteries. So Apple made a poor judgement call: it determined that the benefit of an iOS release that extended battery life and reduced brown-outs on older batteries outweighed the potential, and perhaps unquantified, slow-down effect.
Apple made a mistake, and when it discovered that it had made a mistake, it was open and transparent and offered a solution: to replace the batteries at a large discount ($50 off). Apple didn’t need to do this. Apple had done nothing wrong. Old batteries are old and don’t work as well, and Apple had been trying to improve the performance of those older batteries.
If Apple’s intention had been to simply slow down older models, it would have been much easier to make the new versions of iOS run more slowly on the older models, regardless of battery health. Determining the health of the battery and slowing down based on that would have been a completely unnecessary effort.

Conclusion

Apple acted with transparency and integrity. Apple was clearly trying to improve the experience on old iPhones, and it made a mistake. Yet most people seem to have heard Apple’s message and discounted almost all of it. Many people choose to hear only, “We slowed down old iPhones,” and then they insert a false additional message of “because we want people to buy the new models.”
There is no evidence to suggest that Apple intentionally slowed down old models to make people buy new ones. It’s also extremely unlikely that Apple would apply such a lack mentality to its business. Apple creates and enlarges pies; it does not desperately grasp at existing pies. I’m sure Apple wants us to buy the new models, but it is focused on making them faster and richer in features.
Companies like Apple know that it’s not worth behaving unethically, because at some point those unethical behaviors will leak and harm the brand.
