Talk:PRE2015 4 Groep2
=Minutes=

==Week 1==
===Meeting 20-04===
Goal: Create a demo in which a robotic system is able to detect ripe strawberries and harvest them effectively.

Subsystems:
-Robotic arm
-Machine Vision/Learning

Requirements for presentation:
-USE needs (Yannick + Raomi Team Awesome)
-Scientific literature -> EU projects -> How to go beyond? (Cameron)
-Requirements
--Moving from A to B along a fixed axis
--Cutting fruit
--Sensing for ripeness
--Ambient sensing
--Locating fruit
--Collecting/Handling fruit effectively
--Feedback from farmer to system

-Idea/Solution to the problem

--Moving from A to B along a fixed axis: Fixed railing, Cable, Treads for conveying, 4-wheel/2-wheel drive, Yo-yo

--Cutting fruit: Scissors, Custom cutting mechanism, Laser cutting, Sharp knife (fruit ninja)

--Sensing for ripeness: Kinect, Color sensor, pH sensor, Force sensor, Machine learning ripeness (with Kinect using training set of ripe fruit)

--Ambient sensing: Temperature sensor, Humidity, Air pressure, Sunlight exposure

--Locating fruit: Kinect for depth map, Probe for fruit (use color sensor to identify where fruit is located), Touch sense for fruit (and then use color sensor), Fixed location for fruit

--Collecting/Handling fruit effectively: Basket collection, Soft packaging, Grip from stem

--Feedback from farmer to system

-Plan of approach: Divide tasks into sub-groups:
--Building the robotic arm (mechanical) (TB)
--Machine Vision/Learning (software) (TC)
--Control of the robotic arm (software and electronics) (TBC)
--Feedback from farmer (software) (TA)
--USE aspects (TA)

Key: TA: Yannick & Raomi; TB: Mark & Maarten; TC: Cameron and Birgit

Deadline Week 1:
-Defining project plan and timeline
-Specify USE aspects and identify multiple solutions
-Elucidate requirements
-Compile literature
-Create a presentation

Deadline Week 2:
-First presentation
-Parts list (BOM)
-Order parts
-Begin drawings/concepts

Deadline Week 3:
-Beginning of meetings
-Mark abandons us
-Consolidate drawings

Deadline Week 4:
-Separately working components
-Working base for movement from A to B

Deadline Week 5: Build the arm

Deadline Week 6: Control of arm

Deadline Week 7: Integration of subsystems

Deadline Week 8: Testing of system

Deadline Week 9: Phase-out period

==Week 2==
===Meeting 25-04===

'''Idea in steps:'''


Problem description:

Fewer people will be needed to work on farms; they can work in other fields (like helping the elderly).

Facts and figures!

Move away from the 3-group structure? No.
The whole group working on the USE aspects first?
The groups should not be too large.

We should first redesign the whole system with a focus on the way the USE aspects interact with and change the design (as seen from a purely technical perspective).

We should specify more requirements with respect to the USE aspects, not as much from the technical side.

'''Users:'''
-Primary:
:-Farmers:
::-Higher efficiency
::-More harvest
::-Lower costs due to fewer employees
::-They will have more time
::-Less physical work / Health benefits
::-Don’t have to train seasonal workers
-Secondary:
:-Supermarkets, Distributors:
::-Higher quality food
::-Lower cost for food
::-There will be more fresh food available (maybe?)
::-More efficient supply chain
::-The supermarkets might also be able to give feedback.
-Tertiary:
:-Company that creates the robots and maintains them:

'''Society'''
:-There will not be enough food in the near future for all the people.
:-Not enough workers, due to aging.
:-Decrease in wage gap due to overabundance of food
:-Post scarcity


'''Enterprise:'''
:-Large farms will start to dominate the market, which will result in an increase in the gap between rich and poor.
:-Lower worker costs
:-Lower food prices might result in more sales

'''Results from USE'''
:-The supermarket might want to have a say in which fruits are ripe.
::-Direct and faster communication
:-We can better specify the amount of food that we need.
:-There is a lot of food waste, even due to food not looking good.
::-Robots can determine what the best use for fruits is based on many factors (e.g. looks)
:-Farmer feedback:
::-Via his/her smartphone.
::-Online database for machine learning
:::-Many farmers can have access -> lots of pictures means high accuracy
::-Farmers and secondary users should have a separate application
:::-Farmer should have more control (e.g. stop button)
:-Greenhouses
:-Robots save space
::-Verticality
::-More plants
::-Can work 24/7

'''Society idea:'''
:-Every buyer can indicate which fruit it wants and how many at which time.
:-Every farmer can indicate which fruits it can deliver at what time.

Raomi: Society part (needs, with H2020)
Birgit: Identify supply chain
Cameron: Technical plan
Yannick: Planning
Maarten: User part
Mark: Enterprise part

===Meeting 28-04===
'''How it's done right now''' (researched by Birgit)
Supply chain state of the art: what they do now is sell not-nice-looking food to farmers as animal feed (it is not thrown away). There is a machine that sorts potatoes by size, ugliness, etc., but it is probably not findable online (no research has been done on it).

User part: not done, put on Drive AP.

Society: found on H2020 (expand..), no figures and facts there.

Enterprise: not done, put on Drive AP.

'''Idea:'''
- One system with a Kinect, Raspberry Pi, … to classify fruit. Connects to a database for classifying fruit.
- Casing with custom lighting for uniform lighting conditions.
- Cloud database for the pictures, with machine learning.

Problem: images taken by different types of cameras have different color reproduction. Possible solution: one type of tablet. Another possible solution: only use the Kinect for sensing (something else for the interface). Thus we have only one system. This solution is chosen.
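
As a rough illustration of the chosen single-system setup, a minimal sketch of how the Raspberry Pi side could POST a captured picture to the cloud classification service is given below. The endpoint URL, form field and response format are assumptions for illustration, not our actual API.

<syntaxhighlight lang="python">
# Minimal sketch: the Raspberry Pi posts one captured image to a (hypothetical)
# cloud classification endpoint and prints the returned label.
# Assumes the `requests` library; the URL and JSON fields are placeholders.
import requests

API_URL = "https://example.com/api/classify"  # hypothetical endpoint

def classify_image(image_path: str) -> dict:
    """Send a JPEG to the cloud service and return its JSON response."""
    with open(image_path, "rb") as f:
        files = {"image": ("strawberry.jpg", f, "image/jpeg")}
        resp = requests.post(API_URL, files=files, timeout=10)
    resp.raise_for_status()
    return resp.json()  # e.g. {"label": "ripe", "confidence": 0.93}

if __name__ == "__main__":
    result = classify_image("capture.jpg")
    print("Classification:", result)
</syntaxhighlight>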

Teams:
App team (UI/UX, app, interface): Raomi, Birgit, Yannick.
Database/back-end/Kinect team (machine learning, neural networks): Cameron, Maarten, Mark.

Ethical part
- How broad should the enterprise scale/impact analysis be? AP: ask.
- During the project, make sure that the user is taken into consideration (in the design).

[https://www.mendix.com/application-platform-as-a-service/mobile-paas/]

==Week 3==
===Meeting 02-05===
'''Focus:''' Isolated fruit detection in controlled lighting

'''Tasks'''
Learning about CNNs and basics, implement a non-fruit NN (look into TensorFlow) - MV, CW, MdJ
Read literature about previously implemented CNNs - MV, CW, MdJ
Basic CNN, demonstrate fruit detection - CW (see the sketch after this list)
Create a basic API to POST images and catch the response - MV
Screenflow for App - Raomi
USE aspects of App, why does the app help the solution? - YA, RvR
Basic visuals of app with walkthrough - BvdS, YA
Look into Mendix - BvdS

Find a U, S, and E to interview about automation in their process/harvesting, as well as our solution and how it impacts their sector. - MV (U, E), BvdS (U, S)
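
As a rough starting point for the "basic CNN" task above, the sketch below shows a small convolutional network in Keras/TensorFlow for binary ripe/unripe classification. The folder layout, image size and hyperparameters are assumptions for illustration, not the network we actually trained.

<syntaxhighlight lang="python">
# Minimal sketch of a small CNN for binary ripe/unripe strawberry classification.
# Assumes TensorFlow 2.x (Keras) and a hypothetical folder layout:
#   data/ripe/*.jpg and data/unripe/*.jpg
import tensorflow as tf

IMG_SIZE = (64, 64)

# Labelled pictures (e.g. collected via the app and the online database).
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data", image_size=IMG_SIZE, batch_size=32, label_mode="binary")

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # probability of "ripe"
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=10)
model.save("strawberry_cnn.h5")
</syntaxhighlight>

A network like this learns its own features from the labelled pictures, which is the main argument for a CNN over hand-crafted color features (see the "Why CNN?" question in week 4).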

==Week 4==
===Meeting 09-05===
'''Updated Planning:'''
'''-Before Thursday 12-05'''
:Maarten: Find local farmer
:How will we define color? (see the color sketch after the planning overview)
:Which lighting conditions do we want?
:Mark: Why a CNN? (compared with, for example, a support vector machine)
:Yannick: PowerPoint version of the app
:Birgit: First version of the app
:LED light ring

'''-Before Thursday 19-05'''
:Cameron, Maarten, Mark: Gathering database with Raspberry Pi and Kinect
:Farmer: [http://www.kwekerij-jansen.nl/assortiment]
:Farmer: Needed to ensure the use of our product

'''-Week 3'''
:Preliminary design for app
:First implementation of app
:Database/Server setup
:CNN, and basics of neural networks

'''-Week 4'''
:App v1.0 with design fully implemented
:Kinect interfacing to Raspberry Pi completed
:USEing intensifies
:Finish back-end design and choose frameworks

'''-Week 5'''
:App v2.0 with design fully implemented and tested
:Further training of CNN
:Working database classification (basic)
:Casing (with studio lighting LED shining on fruit)
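
Related to the open question above on how to define color: a very simple baseline (not our chosen method, which is the CNN) would be to measure the fraction of red pixels in HSV space under the controlled LED lighting. The thresholds below are illustrative assumptions and would need tuning to the actual light ring and casing.

<syntaxhighlight lang="python">
# Illustrative baseline only: estimate ripeness from the fraction of red pixels
# in HSV space. Thresholds are assumptions and depend on the LED light ring.
import cv2
import numpy as np

def red_fraction(image_path: str) -> float:
    """Return the fraction of pixels whose hue falls in the red range."""
    bgr = cv2.imread(image_path)
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis in OpenCV (0-179), so combine two ranges.
    lower = cv2.inRange(hsv, np.array([0, 80, 60]), np.array([10, 255, 255]))
    upper = cv2.inRange(hsv, np.array([170, 80, 60]), np.array([179, 255, 255]))
    mask = cv2.bitwise_or(lower, upper)
    return float(np.count_nonzero(mask)) / mask.size

if __name__ == "__main__":
    frac = red_fraction("strawberry.jpg")  # hypothetical capture from the casing
    print("Ripe" if frac > 0.4 else "Not ripe", f"(red fraction = {frac:.2f})")
</syntaxhighlight>

A heuristic like this also makes explicit why the controlled lighting matters: the same thresholds behave very differently under changing ambient light, which is part of the reason for the casing with its own LED lighting.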

===Meeting 12-05===
During this meeting we worked on and talked about various subjects:
'''App'''
:-Screenflow
::Raomi made some first versions of the screenflow, showing the functionality of the app and which screen should follow which.
:-Powerpoint
::Yannick made and showed a first version of the app in PowerPoint. This followed the screenflows made earlier, but also gave a first look and feel of the layout of the app and where each button should be placed. It gave the opportunity to test and see whether some things should change or not.
:-"Real" app
::Birgit downloaded the files needed to make an app and started acquiring the required knowledge by following tutorials.
'''Neural Networks'''
:Mark made an overview of different existing neural networks and gave upsides and downsides of each. He put this in a Word document, which was posted on the Google Drive.

==Week 5==
===Meeting 19-05===
This week we did not have that much to talk about and work on, because we did not receive any feedback due to Whit Monday. Therefore we mostly continued doing what we were already doing and what we had planned.
During this meeting we worked on and talked about various subjects:
'''App'''
:-"Real" app
::Birgit made a first version of the app based on the screenflows and the PowerPoint version made earlier. This gave an even more complete overview and look and feel of the app.
'''Computer Vision'''
:During this week we mostly worked on getting the Xbox Kinect to take pictures and on training the CNN. A capture sketch is shown below.
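
For reference, grabbing a single RGB frame from the Kinect on the Raspberry Pi could look roughly like the sketch below, assuming the libfreenect library with its Python bindings is installed; this is an illustration, not our exact capture script.

<syntaxhighlight lang="python">
# Rough sketch of capturing one RGB frame from an Xbox Kinect and saving it,
# assuming libfreenect with its Python bindings (`freenect`) and OpenCV.
import cv2
import freenect

def capture_frame(path: str = "capture.jpg") -> None:
    """Grab a single RGB frame from the Kinect and write it as a JPEG."""
    rgb, _timestamp = freenect.sync_get_video()  # returns an RGB numpy array
    bgr = cv2.cvtColor(rgb, cv2.COLOR_RGB2BGR)   # OpenCV expects BGR order
    cv2.imwrite(path, bgr)

if __name__ == "__main__":
    capture_frame()
</syntaxhighlight>

The saved frames could then be posted to the cloud database with an upload sketch like the one in the 28-04 meeting notes.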

==Week 6==
===Meeting 23-05===
During this week we only had one meeting, since half of our team (Maarten, Mark and Cameron) went to the PicknPack conference on the 26th.

'''App'''
On Monday we decided to change the functionality of our app to include a Tinder-like swiping feature, so that the farmer can quickly rate pictures of strawberries. This was then also added to the PowerPoint version of the app, after which Birgit worked on incorporating it into the "real" version of the app. A sketch of how such swipes could feed the online learning database is given below.
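
To make concrete how the swipe feedback could reach the "online database for machine learning" mentioned in week 2, the sketch below shows a tiny server endpoint that stores each swipe as a labelled example. Flask, the route name and the SQLite schema are illustrative assumptions, not our actual back end.

<syntaxhighlight lang="python">
# Hedged sketch: a tiny endpoint that records each farmer swipe (ripe / not ripe)
# as a labelled example for later CNN training. Flask, the route, and the schema
# are illustrative assumptions.
import sqlite3
from flask import Flask, request, jsonify

app = Flask(__name__)
DB = "labels.db"

def init_db() -> None:
    with sqlite3.connect(DB) as con:
        con.execute(
            "CREATE TABLE IF NOT EXISTS labels ("
            "image_id TEXT, label INTEGER, farmer TEXT)"
        )

@app.route("/label", methods=["POST"])
def add_label():
    """Store one swipe: {'image_id': ..., 'ripe': true/false, 'farmer': ...}."""
    data = request.get_json(force=True)
    with sqlite3.connect(DB) as con:
        con.execute(
            "INSERT INTO labels VALUES (?, ?, ?)",
            (data["image_id"], int(bool(data["ripe"])), data.get("farmer", "")),
        )
    return jsonify({"status": "ok"})

if __name__ == "__main__":
    init_db()
    app.run(host="0.0.0.0", port=5000)
</syntaxhighlight>

Each swipe then adds one more labelled picture, in line with the week 2 note that many farmers contributing pictures means higher accuracy.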

'''Computer vision'''
More work was done on creating the convolutional neural network used for classifying the strawberries.

==Week 7==
===Meeting 30-05===
During this meeting we once again worked on many different points in parallel, which resulted in a lot of progress, both during this meeting and the meeting after.
'''App'''
:-Powerpoint
::During this meeting we made a third version of the app, in which the user can see the specifications of his/her farm per region instead of only a total view.

===Meeting 02-06===
This meeting was mostly used to create the different buttons for the app and to handle the feedback given by the farmer. We used this to expand our system towards a higher-level design of a farm that uses our system. Furthermore, we also worked on and wrote about the societal problems that arise and which our system can solve. A fourth and a fifth version of the app were also made.

==Week 8==
===Meeting 06-06===
This meeting was used to decide what still needed to be done for the final presentation. We made some final adjustments to the app and tested everything.
===Meeting 09-06===
This meeting was mostly used to create the final presentation and to finish the database and vision system that were needed for it.

==Week 9==
===Meeting 16-06===
During this meeting we worked solely on improving and finishing the wiki page. We used the rubric and the pages of other groups as inspiration and guidelines.