PRE2015 4 Groep2

We are developing an autonomous harvesting robot, initially aimed at harvesting strawberries. As creating a complete prototype is probably not feasible in nine weeks, we focus first on the detection and sensing part. For that we will develop a system that scans fruits and determines their ripeness. It can also consider other factors, such as whether the fruit looks appealing.

Group 2 members

  • Cameron Weibel (0883114)
  • Maarten Visscher (0888263)
  • Raomi van Rozendaal (0842742)
  • Birgit van der Stigchel (0855323)
  • Mark de Jong (0896731)
  • Yannick Augustijn (0856560)

Project description

The goal is a robot that can classify fruits based on their ripeness and appeal factors. The fruits are detected either on a transport belt or in the field (still to be decided).

Todo: add problem description.

Requirements

Functional requirements

  • The robot should be able to detect fruit using a Kinect camera.
  • It should be able to classify the ripeness of the fruit using a convolutional neural network (a minimal sketch follows after this list).
  • The robot will query an online database about the ripeness of a certain fruit; the database will return the ripeness percentile of that fruit relative to different fruit image sets.
  • The farmer should be able to take pictures of overripe/underripe fruit and add them to the training set, giving feedback that improves the robot.
  • The farmer should be able to interface with the database as well as different harvesting metrics through a mobile device.
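
The classification and database requirements above could be prototyped roughly as follows. This is a minimal sketch only, assuming Keras/TensorFlow for the convolutional network and NumPy for the percentile computation; the layer layout, the 64x64 input size, and the reference_scores array (standing in for what the online database would return) are illustrative assumptions, not the final design.

  # Minimal sketch of the ripeness-classification idea (assumed stack: Keras/TensorFlow).
  # Input: a 64x64 RGB crop of a single fruit; output: a ripeness score in [0, 1].
  import numpy as np
  from tensorflow.keras import layers, models

  def build_ripeness_cnn(input_shape=(64, 64, 3)):
      """Small convolutional network mapping a fruit crop to a ripeness score."""
      model = models.Sequential([
          layers.Input(shape=input_shape),
          layers.Conv2D(16, 3, activation="relu"),
          layers.MaxPooling2D(),
          layers.Conv2D(32, 3, activation="relu"),
          layers.MaxPooling2D(),
          layers.Flatten(),
          layers.Dense(64, activation="relu"),
          layers.Dense(1, activation="sigmoid"),  # ripeness score between 0 and 1
      ])
      model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
      return model

  def ripeness_percentile(score, reference_scores):
      """Percentile of `score` within scores from earlier fruit image sets.
      `reference_scores` is a placeholder for what the online database would return."""
      reference_scores = np.asarray(reference_scores)
      return 100.0 * float(np.mean(reference_scores <= score))

  if __name__ == "__main__":
      model = build_ripeness_cnn()
      fake_crop = np.random.rand(1, 64, 64, 3).astype("float32")  # placeholder for a Kinect crop
      score = float(model.predict(fake_crop, verbose=0)[0, 0])
      print("ripeness score:", score)
      print("percentile vs. reference set:", ripeness_percentile(score, np.random.rand(500)))

After training on labelled ripe/unripe crops, the same percentile lookup could also drive the more binary "above/below 50% quality" fallback classification mentioned in the planning.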

Non-functional requirements

  • It should be relatively simple to add the Kinect + Raspberry Pi to an existing harvesting system.
  • The farmer should be able to use the system with minimal prior knowledge.
  • The robot should perform better than a human quality controller.
  • The robot should have all the safety features necessary to ensure no critical failures.

USE aspects

User

Primary users are farmers and their workers, who directly use the robot. The following aspects hold:

  • Their work becomes far less intensive and heavy. Instead of harvesting directly, farmers can let the robot do the work; they would only occasionally need to check the harvest and possibly adjust some parameters. This work is lighter than harvesting, so fewer health problems due to heavy labour can be expected.
  • They gain more free time for other things, because the new work takes far less time and there is no longer a need to train seasonal workers.

Secondary users are distributors that pick up the fruits from the farms. They use the robot occasionally, when they need to collect the fruits it has picked. Their work is mostly unaffected; however, some parts of it can be left to the robot, depending on how advanced it is, such as selecting fruits based on ripeness and visual appeal. The robot could also directly package and seal the fruits.

A tertiary user is the company that develops and maintains the robot. It indirectly uses the robot during development.

Another tertiary user (arguably more of a societal aspect) is the manual harvesting worker. A worker who harvests fruit by hand does not come into direct contact with the robot, but is affected by it, as the robot takes away these jobs. This aspect should be researched further. Harvesting workers are typically people with little formal education and students looking to earn some extra money; the former group can be expected to have a hard time finding a new job.

Society

Enterprise

Planning

Week 2
  • Clarify our project goals
  • Work on USE aspects
  • Finalize planning and technical plan
  • Sketch a prototype

Week 3
  • Preliminary design for the app
  • First implementation of the app
  • Database/server setup
  • CNN, and basics of neural networks

Week 4
  • App v1.0 with design fully implemented
  • Kinect interfacing to Raspberry Pi completed
  • USEing intensifies
  • Finish back-end design and choose frameworks

Week 5
  • App v2.0 with design fully implemented and tested
  • Further training of the CNN
  • Working database classification (basic)
  • Casing (with studio lighting LED shining on the fruit)

Week 6
  • Improve the CNN
  • Expand training sets (outside of strawberries, if possible)
  • User testing on the app
  • Reflect on USE aspects and determine whether we still preserve our USE values

Week 7
  • Implement feedback from app testing
  • Improve aesthetic appeal detection (if time permits)

Week 8
  • Finish everything
  • Buffer period
  • Final reflection on USE value preservation

Week 9
  • Improve the wiki for evaluation
  • Peer review

Fallback options (see the sketch after this list):
  • App for the user to report feedback in the form of images of high/low quality fruit.
  • Have the Raspberry Pi take pictures using the Kinect and send them to the database.
  • Choose a more binary classification (below 50% / above 50% quality).
  • Flesh out the design more (if implementation fails).
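
As a rough sketch of the fallback route above (Raspberry Pi takes a picture and sends it to the database), the capture-and-upload step could look like the code below. This assumes OpenCV for the capture (the real setup would talk to the Kinect, e.g. via libfreenect) and the requests library for the HTTP upload; the server address and the /upload endpoint are made-up placeholders, not an existing API.

  # Fallback sketch: grab one frame on the Raspberry Pi and POST it to the database server.
  # cv2.VideoCapture(0) stands in for the Kinect RGB stream; UPLOAD_URL is a placeholder.
  import cv2
  import requests

  UPLOAD_URL = "http://example-farm-server.local/upload"  # hypothetical endpoint

  def capture_frame():
      """Grab a single frame from the default camera (Kinect stand-in)."""
      cap = cv2.VideoCapture(0)
      ok, frame = cap.read()
      cap.release()
      if not ok:
          raise RuntimeError("Could not read a frame from the camera")
      return frame

  def send_frame(frame):
      """Encode the frame as JPEG and upload it for later labelling/training."""
      ok, jpeg = cv2.imencode(".jpg", frame)
      if not ok:
          raise RuntimeError("JPEG encoding failed")
      files = {"image": ("fruit.jpg", jpeg.tobytes(), "image/jpeg")}
      response = requests.post(UPLOAD_URL, files=files)
      response.raise_for_status()

  if __name__ == "__main__":
      send_frame(capture_frame())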

Technical aspects

Database

Application

Application design (User interface)

Research

Harvesting robots

  • Paper from 1993 describing the then state-of-the-art and economic aspects. It has a chapter on economic evaluation.
  • Recent TU/e paper discussing the state of the art in tomato harvesting. It focuses on the mechanical part and does not cover sensing and detection.

Sensing technology

(Older)

  • Yamamoto, S., et al. "Development of a stationary robotic strawberry harvester with picking mechanism that approaches target fruit from below (Part 1) - Development of the end-effector." Journal of the Japanese Society of Agricultural Machinery 71.6 (2009): 71-78. Link
  • Corbett-Davies, Sam, Tom Botterill, Richard Green, and Valerie Saxton. "An expert system for automatically pruning vines." Proceedings of the 27th Conference on Image and Vision Computing New Zealand, November 26-28, 2012, Dunedin, New Zealand. Link
  • Hayashi, Shigehiko, Katsunobu Ganno, Yukitsugu Ishii, and Itsuo Tanaka. "Robotic Harvesting System for Eggplants." Japan Agricultural Research Quarterly (JARQ) 36.3 (2002): 163-68. Link
  • Blasco, J., N. Aleixos, and E. Moltó. "Machine Vision System for Automatic Quality Grading of Fruit." Biosystems Engineering 85.4 (2003): 415-23. Link
  • Cubero, Sergio, Nuria Aleixos, Enrique Moltó, Juan Gómez-Sanchis, and Jose Blasco. "Advances in Machine Vision Applications for Automatic Inspection and Quality Evaluation of Fruits and Vegetables." Food and Bioprocess Technology 4.4 (2010): 487-504. Link
  • Tanigaki, Kanae, et al. "Cherry-harvesting robot." Computers and Electronics in Agriculture 63.1 (2008): 65-72. Direct Dianus
    • Evaluation of a cherry-harvesting robot. It picks by grabbing the peduncle and lifting it upwards.
  • Hayashi, Shigehiko, et al. "Evaluation of a strawberry-harvesting robot in a field test." Biosystems Engineering 105.2 (2010): 160-171. Direct Dianus
    • Evaluation of a strawberry-harvesting robot.

State of the art

A small number of tests have been done with machines for harvesting strawberries. These are large, bulky and expensive machines like Agrobot; cost prices are on the order of 50,000 dollars. Todo: add citations.

A lot of research has been done on inspection by means of machine vision. Todo: add citations and continue.

Further reading

  • Agrobot video: https://youtu.be/RKT351pQHfI
  • Neural Networks Demystified: https://www.youtube.com/watch?v=bxe2T-V8XRs
  • MIT Food Computer: https://www.youtube.com/watch?v=LEx6K4P4GJc
  • Dickson Despommier on the vertical farm: https://www.youtube.com/watch?v=XIdP00u2KRA
  • Agriculture is the fastest growing robotic sector: http://www.eetimes.com/document.asp?doc_id=1329273
  • Japanese firm to open world’s first robot-run farm: http://www.theguardian.com/environment/2016/feb/01/japanese-firm-to-open-worlds-first-robot-run-farm
  • Aeroponics (we most likely won’t use this as an irrigation method): https://en.wikipedia.org/wiki/Aeroponics
  • Polyculture: https://en.wikipedia.org/wiki/Polyculture
  • Why to avoid monoculture: http://evolution.berkeley.edu/evolibrary/article/agriculture_02
  • LED lights for imitating sunlight: http://www.gereports.com/post/105532612260/the-future-of-agriculture-indoor-farms-powered-by/
  • Types of machine learning algorithms: http://machinelearningmastery.com/a-tour-of-machine-learning-algorithms/

Manual strawberry harvesting process

Source (move to citation)

Meetings

Moved to Talk:PRE2015_4_Groep2.