Ideating + paper prototyping
At this workshop, we were asked to redesign Sproutly, a fictitious app for people with small gardens that shows gardeners when plants need to be harvested or whether they need sun or water.
Before the workshop, a prototype was shown to a user who gave this feedback:
• She liked that she could see individual plants.
• But she didn’t know what she was supposed to do or see.
• And she couldn’t see what parts of the garden needed attention.
• She wanted it to be “really easy to see what plants need care.”
Inspired by thermal-imaging cameras and the evolving capabilities of sensors and motion-detection cameras, I imagined a mobile app that could display problem areas in the garden in real time.
My app would also scale to almost any garden, as long as the space could be captured by a phone's camera and fitted with sensors.
This may sound futuristic, but the technology is nearly here: IoT smart devices and sensors that detect moisture, dryness, and motion already exist. If ripeness can only be judged from visible attributes, I suggest using motion-detection cameras and machine learning to interpret their video over a 24-hour period. That footage would capture minute but important changes in a plant's state and size, which AI could distill into a recommended ripeness or picking time and translate into a plain phrase on the user's phone.
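To make the idea concrete, here is a minimal sketch of that last step: turning a day of readings into a phrase a gardener can act on. Everything here is an illustrative assumption, not a real Sproutly API; the field names (`moisture`, `size_cm`), the thresholds, and the "slowed growth means ripe" heuristic are all placeholders for what real sensors and a trained model would provide.

```python
# Hypothetical sketch: distill 24 hours of per-plant readings into a care phrase.
# Field names, thresholds, and the ripeness heuristic are illustrative assumptions.

def care_phrase(readings):
    """readings: chronological list of dicts with hourly 'moisture' (0-1)
    and 'size_cm' (plant size estimated from camera footage)."""
    avg_moisture = sum(r["moisture"] for r in readings) / len(readings)
    growth = readings[-1]["size_cm"] - readings[0]["size_cm"]

    if avg_moisture < 0.3:          # assumed dryness threshold
        return "Needs water"
    if growth < 0.1:                # assumed heuristic: growth has plateaued
        return "Ready to pick"
    return "Doing fine"

# One simulated day: steady moisture, slow growth
day = [{"moisture": 0.5, "size_cm": 10.0 + 0.01 * h} for h in range(24)]
print(care_phrase(day))
```

In a real system, the heuristics above would be replaced by a model trained on labeled footage, but the output contract stays the same: raw sensor data in, one short actionable phrase out.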