CN105243270A - Diet monitoring method, apparatus and system and catering furniture - Google Patents

Diet monitoring method, apparatus and system and catering furniture

Info

Publication number
CN105243270A
Authority
CN
China
Prior art keywords
weighing
moment
dining table
subimage
weight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510612477.XA
Other languages
Chinese (zh)
Other versions
CN105243270B (en)
Inventor
王百超
陈志军
侯文迪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Technology Co Ltd
Xiaomi Inc
Original Assignee
Xiaomi Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiaomi Inc
Priority to CN201510612477.XA
Publication of CN105243270A
Application granted
Publication of CN105243270B
Legal status: Active (granted)


Abstract

The present disclosure relates to a diet monitoring method, apparatus and system and catering furniture, and belongs to the field of electronic technology applications. The method comprises: comparing whether a dining table surface image at a dining moment differs from an original dining table surface image, wherein the original dining table surface image is an image of the table surface of the dining table with no articles placed on it; when the dining table surface image differs from the original dining table surface image, obtaining a difference image, wherein the difference image comprises N sub-images, the area of any sub-image is smaller than the area of the original image, and N is an integer greater than or equal to 1; determining a diet label of each of the N sub-images; and sending catering information, wherein the catering information comprises the diet labels of the N sub-images. The diet monitoring method, apparatus and system and the catering furniture provided by the present disclosure simplify the steps of diet monitoring and thereby improve its efficiency, and are used for diet monitoring.

Description

Diet monitoring method, apparatus, system and catering furniture
Technical field
The present disclosure relates to the field of electronic technology applications, and in particular to a diet monitoring method, apparatus and system and catering furniture.
Background
As people's living standards keep improving, they pay ever greater attention to personal health, and a wide variety of healthy-diet applications for diet recording and health management have emerged.
In the related art, when using such a healthy-diet application, a user can enter his or her physical condition (such as height and weight) and daily diet records into the application, and the application can provide reasonable dietary recommendations according to the user's physical condition.
Summary of the invention
To solve the problems in the related art, the present disclosure provides a diet monitoring method, apparatus, system and catering furniture. The technical solutions are as follows:
According to a first aspect of the embodiments of the present disclosure, a diet monitoring method is provided. The method comprises:
comparing whether a dining table surface image at a dining moment differs from an original dining table surface image, the original dining table surface image being an image of the table surface of the dining table with no articles placed on it;
when the dining table surface image differs from the original dining table surface image, obtaining a difference image, the difference image comprising N sub-images, the area of any sub-image being smaller than the area of the original image, and N being an integer greater than or equal to 1;
determining a diet label of each of the N sub-images; and
sending catering information, the catering information comprising the diet labels of the N sub-images.
In the diet monitoring method provided by the embodiments of the present disclosure, a camera can compare whether the dining table surface image at the dining moment differs from the original dining table surface image, obtain a difference image comprising N sub-images when the two differ, determine a diet label for each of the N sub-images, and send catering information comprising the diet labels of the N sub-images to a terminal. The user therefore does not need to manually enter diet records into the terminal, which simplifies the diet monitoring procedure and improves the efficiency of diet monitoring.
Optionally, the method further comprises:
periodically acquiring dining table surface images, with a preset duration as the period length; and
when the area of the difference image between the dining table surface image at the current acquisition moment and the dining table surface image at the previous acquisition moment is greater than a preset area threshold, determining that the current acquisition moment is the dining moment.
The camera can thus determine the dining moment by comparing the dining table surface images acquired at different moments, which improves the accuracy of diet monitoring.
Optionally, the method further comprises:
when the area of the difference image between the dining table surface image at the previous acquisition moment and the dining table surface image at the acquisition moment before the previous one is not greater than the preset area threshold, determining that the current acquisition moment is the dining start moment.
The camera can thus determine the dining start moment by comparing the dining table surface images acquired at different moments, which improves the accuracy of diet monitoring.
Optionally, the method further comprises:
when the area of the difference image between the dining table surface image at the current acquisition moment and the dining table surface image at the previous acquisition moment is not greater than the preset area threshold, judging whether the previous acquisition moment was a dining moment; and
when the previous acquisition moment was a dining moment, determining that the current acquisition moment is the dining end moment.
The camera can thus determine the dining end moment by comparing the dining table surface images acquired at different moments, which improves the accuracy of diet monitoring.
Optionally, the method further comprises:
at the dining start moment and the dining end moment, respectively sending a weighing instruction, the weighing instruction instructing the catering furniture to obtain the weight of the objects on the catering furniture and to send that weight to the terminal, the catering furniture comprising at least one of a dining table and a chair, the chair being located within a preset distance of the dining table.
The camera can thus send weighing instructions to the catering furniture so that the catering furniture obtains the weight of the objects on it and sends that weight to the terminal, which further improves the accuracy of diet monitoring.
Optionally, the weighing instruction further comprises a time identifier, the time identifier indicating whether the moment at which the weighing instruction is sent is the dining start moment or the dining end moment.
The catering furniture can then generate weighing information according to the time identifier, which further improves the accuracy of diet monitoring.
Optionally, the method further comprises:
obtaining a preset region division of the table surface of the dining table in the original dining table surface image;
determining, according to the region division, the region in the original dining table surface image in which each of the N sub-images is located;
obtaining a preset correspondence between the region division of the original dining table surface image and region numbers;
determining, according to the correspondence, the number of the region in the original dining table surface image in which each sub-image is located; and
generating the catering information, the catering information comprising at least the number of the region in the original dining table surface image in which each sub-image is located.
The catering information the camera sends to the terminal thus includes the number of the region in which each sub-image is located, so that the terminal can match each sub-image with the corresponding weighing information according to the region number, which further improves the accuracy of diet monitoring.
Optionally, the catering information further comprises the N sub-images, the N sub-images and the diet labels of the N sub-images being in one-to-one correspondence.
The terminal can then display the N sub-images together with their corresponding diet labels, which makes the catering information displayed by the terminal more intuitive.
According to a second aspect of the embodiments of the present disclosure, a diet monitoring method is provided. The method comprises:
receiving a weighing instruction, the weighing instruction being sent by a camera at a dining start moment or a dining end moment;
obtaining, according to the weighing instruction and by means of a weighing module, the weight of the objects on the catering furniture at the current moment; and
sending weighing information, the weighing information comprising the weight of the objects on the catering furniture at the current moment and an identifier of the catering furniture.
In this method, after the catering furniture receives the weighing instruction sent by the camera, it can use the weighing module to obtain the weight of the objects on it at the current moment and send weighing information, comprising that weight and the identifier of the catering furniture, to the terminal. The terminal can therefore monitor the user's diet according to the weighing information provided by the catering furniture, which improves the accuracy of diet monitoring.
Optionally, when the catering furniture is a dining table, the table surface of the dining table is divided into M regions, the weighing module comprises M sub-weighing modules, and the M sub-weighing modules are arranged below the M regions in one-to-one correspondence.
Obtaining, according to the weighing instruction and by means of the weighing module, the weight of the objects on the catering furniture at the current moment then comprises:
determining, among the M sub-weighing modules, the target sub-weighing modules that perceive a weight change; and
obtaining the weight measured by each target sub-weighing module.
With M sub-weighing modules arranged in the dining table, the dining table can obtain the weight measured by each target sub-weighing module among them, which improves the accuracy of diet monitoring.
Optionally, the method further comprises:
determining, according to the preset region division of the table surface of the dining table, the region in which each target sub-weighing module is located;
obtaining the preset correspondence between the region division of the original dining table surface image and region numbers;
determining, according to the correspondence, the number of the region in which each target sub-weighing module is located; and
generating the weighing information, the weighing information comprising at least the number of the region in which each target sub-weighing module is located and the weight measured by each target sub-weighing module.
The weighing information generated by the dining table thus also includes the numbers of the regions in which the target sub-weighing modules are located, so that the terminal can match the catering information with the weighing information according to these region numbers, which improves the accuracy of diet monitoring.
Optionally, when the catering furniture is a chair,
obtaining, according to the weighing instruction and by means of the weighing module, the weight of the objects placed on the catering furniture at the current moment comprises:
obtaining, according to the weighing instruction and by means of a weighing module arranged below the seat of the chair, the weight of the human body borne by the chair at the current moment.
The chair can thus obtain the user's body weight before and after dining through the weighing module arranged below the seat, which improves the accuracy of diet monitoring.
Optionally, the weighing instruction further comprises a time identifier, the time identifier indicating whether the moment at which the weighing instruction is sent is the dining start moment or the dining end moment;
the weighing information then also comprises the time identifier.
The terminal can analyse the weighing information according to the time identifier it contains, which further improves the accuracy of diet monitoring.
According to a third aspect of the embodiments of the present disclosure, a diet monitoring method is provided. The method comprises:
receiving catering information, the catering information comprising the diet labels of N sub-images; and
displaying the catering information.
After receiving the catering information sent by the camera, the terminal can display the diet record intuitively, which makes diet monitoring more intuitive.
Optionally, the method further comprises:
receiving weighing information, the weighing information comprising the weight of the objects on the catering furniture at the current moment and an identifier of the catering furniture, the weight being obtained by the catering furniture using a weighing module; and
displaying the weighing information.
After receiving the weighing information sent by the catering furniture, the terminal can display it intuitively, which makes diet monitoring more intuitive.
Optionally, the catering information further comprises:
the N sub-images, the N sub-images and the diet labels of the N sub-images being in one-to-one correspondence;
and displaying the catering information comprises:
displaying the N sub-images and the diet labels of the N sub-images in one-to-one correspondence.
The terminal can thus display the N sub-images and their diet labels in correspondence, which improves the accuracy and intuitiveness of diet monitoring.
Optionally, the catering furniture comprises at least one of a dining table and a chair; when the catering furniture is a dining table, the table surface of the dining table is divided into M regions, the weighing module comprises M sub-weighing modules, and the M sub-weighing modules are arranged below the M regions in one-to-one correspondence;
the catering information comprises at least the number of the region in the original dining table surface image in which each of the N sub-images is located, the original dining table surface image being an image of the dining table with no articles placed on it; the weighing information comprises at least the number of the region in which each target sub-weighing module is located and the weight measured by each target sub-weighing module, the target sub-weighing modules being those of the M sub-weighing modules that perceive a weight change;
and displaying the weighing information comprises:
matching the numbers of the regions in which the target sub-weighing modules are located against the numbers of the regions in the original dining table surface image in which the N sub-images are located, and determining the target numbers for which the region number of a sub-image is identical to the region number of a target sub-weighing module; and
displaying, in association, the sub-image corresponding to each target number and the weight measured by the target sub-weighing module corresponding to that target number.
The terminal can thus display, in association, the sub-image corresponding to a target number and the weight measured by the corresponding target sub-weighing module, which improves the accuracy and intuitiveness of diet monitoring.
Optionally, the weighing information further comprises a time identifier, the time identifier indicating whether the moment at which the camera sent the weighing instruction to the catering furniture was the dining start moment or the dining end moment, and the method further comprises:
among the received weighing information, taking the weight of the objects on the catering furniture contained in the weighing information whose time identifier indicates the dining start moment as a first weight;
among the received weighing information, taking the weight of the objects on the catering furniture contained in the weighing information whose time identifier indicates the dining end moment as a second weight;
taking the difference between the first weight and the second weight as the intake;
generating dietary recommendation information according to the intake; and
displaying the dietary recommendation information.
The terminal can thus also generate and display dietary recommendation information according to the received catering information and weighing information, which enriches what the terminal displays.
Optionally, the weighing information further comprises a time identifier, the time identifier indicating whether the moment at which the camera sent the weighing instruction to the catering furniture was the dining start moment or the dining end moment, and the method further comprises:
among the received weighing information, taking the weight of the objects on the catering furniture contained in the weighing information whose time identifier indicates the dining start moment as a first weight;
among the received weighing information, taking the weight of the objects on the catering furniture contained in the weighing information whose time identifier indicates the dining end moment as a second weight;
obtaining, from the first weight, the first sub-weight measured by the target sub-weighing module corresponding to a target number;
obtaining, from the second weight, the second sub-weight measured by the target sub-weighing module corresponding to that target number;
taking the difference between the first sub-weight and the second sub-weight as the target intake;
generating dietary recommendation information according to the target intake and the diet label of the sub-image corresponding to the target number; and
displaying the dietary recommendation information.
The terminal can thus also generate and display dietary recommendation information according to the received catering information and weighing information, which enriches what the terminal displays and improves the accuracy of diet monitoring.
According to a fourth aspect of the embodiments of the present disclosure, a diet monitoring apparatus is provided. The apparatus comprises:
a comparison module configured to compare whether the dining table surface image at a dining moment differs from an original dining table surface image, the original dining table surface image being an image of the table surface of the dining table with no articles placed on it;
a first acquisition module configured to obtain, when the dining table surface image differs from the original dining table surface image, a difference image, the difference image comprising N sub-images, the area of any sub-image being smaller than the area of the original image, and N being an integer greater than or equal to 1;
a first determination module configured to determine a diet label of each of the N sub-images; and
a first sending module configured to send catering information, the catering information comprising the diet labels of the N sub-images.
Optionally, the apparatus further comprises:
a collection module configured to periodically acquire dining table surface images, with a preset duration as the period length; and
a second determination module configured to determine, when the area of the difference image between the dining table surface image at the current acquisition moment and the dining table surface image at the previous acquisition moment is greater than a preset area threshold, that the current acquisition moment is the dining moment.
Optionally, the apparatus further comprises:
a third determination module configured to determine, when the area of the difference image between the dining table surface image at the previous acquisition moment and the dining table surface image at the acquisition moment before the previous one is not greater than the preset area threshold, that the current acquisition moment is the dining start moment.
Optionally, the apparatus further comprises:
a first judging module configured to judge, when the area of the difference image between the dining table surface image at the current acquisition moment and the dining table surface image at the previous acquisition moment is not greater than the preset area threshold, whether the previous acquisition moment was a dining moment; and
a fourth determination module configured to determine, when the previous acquisition moment was a dining moment, that the current acquisition moment is the dining end moment.
Optionally, the apparatus further comprises:
a second sending module configured to send a weighing instruction at the dining start moment and at the dining end moment respectively, the weighing instruction instructing the catering furniture to obtain the weight of the objects on the catering furniture and to send that weight to the terminal, the catering furniture comprising at least one of a dining table and a chair, the chair being located within a preset distance of the dining table.
Optionally, the weighing instruction further comprises a time identifier, the time identifier indicating whether the moment at which the weighing instruction is sent is the dining start moment or the dining end moment.
Optionally, the apparatus further comprises:
a second acquisition module configured to obtain a preset region division of the table surface of the dining table in the original dining table surface image;
a fifth determination module configured to determine, according to the region division, the region in the original dining table surface image in which each of the N sub-images is located;
a third acquisition module configured to obtain a preset correspondence between the region division of the original dining table surface image and region numbers;
a sixth determination module configured to determine, according to the correspondence, the number of the region in the original dining table surface image in which each sub-image is located; and
a generation module configured to generate the catering information, the catering information comprising at least the number of the region in the original dining table surface image in which each sub-image is located.
Optionally, the catering information further comprises the N sub-images, the N sub-images and the diet labels of the N sub-images being in one-to-one correspondence.
According to a fifth aspect of the embodiments of the present disclosure, catering furniture is provided. The catering furniture comprises at least one of a dining table and a chair;
a weighing module and a communication module are arranged in the catering furniture; and
the weighing module is connected to the communication module.
Optionally, when the catering furniture is a dining table, the table surface of the dining table is divided into M regions, the weighing module comprises M sub-weighing modules, and the M sub-weighing modules are arranged below the M regions in one-to-one correspondence;
the M sub-weighing modules are each connected to the communication module.
Optionally, when the catering furniture is a chair, the weighing module is arranged below the seat of the chair.
According to a sixth aspect of the embodiments of the present disclosure, a diet monitoring apparatus is provided. The apparatus comprises:
a first receiving module configured to receive catering information, the catering information comprising the diet labels of N sub-images; and
a first display module configured to display the catering information.
Optionally, the apparatus further comprises:
a second receiving module configured to receive weighing information, the weighing information comprising the weight of the objects on the catering furniture at the current moment and an identifier of the catering furniture, the weight being obtained by the catering furniture using a weighing module; and
a second display module configured to display the weighing information.
Optionally, the catering information further comprises:
the N sub-images, the N sub-images and the diet labels of the N sub-images being in one-to-one correspondence;
and the first display module is configured to:
display the N sub-images and the diet labels of the N sub-images in one-to-one correspondence.
Optionally, the catering furniture comprises at least one of a dining table and a chair; when the catering furniture is a dining table, the table surface of the dining table is divided into M regions, the weighing module comprises M sub-weighing modules, and the M sub-weighing modules are arranged below the M regions in one-to-one correspondence;
the catering information comprises at least the number of the region in the original dining table surface image in which each of the N sub-images is located, the original dining table surface image being an image of the dining table with no articles placed on it; the weighing information comprises at least the number of the region in which each target sub-weighing module is located and the weight measured by each target sub-weighing module, the target sub-weighing modules being those of the M sub-weighing modules that perceive a weight change;
and the second display module is configured to:
match the numbers of the regions in which the target sub-weighing modules are located against the numbers of the regions in the original dining table surface image in which the N sub-images are located, and determine the target numbers for which the region number of a sub-image is identical to the region number of a target sub-weighing module; and
display, in association, the sub-image corresponding to each target number and the weight measured by the target sub-weighing module corresponding to that target number.
Optionally, the weighing information further comprises a time identifier, the time identifier indicating whether the moment at which the camera sent the weighing instruction to the catering furniture was the dining start moment or the dining end moment, and the apparatus further comprises:
a first determination module configured to take, among the received weighing information, the weight of the objects on the catering furniture contained in the weighing information whose time identifier indicates the dining start moment as a first weight;
a second determination module configured to take, among the received weighing information, the weight of the objects on the catering furniture contained in the weighing information whose time identifier indicates the dining end moment as a second weight;
a third determination module configured to take the difference between the first weight and the second weight as the intake;
a first generation module configured to generate dietary recommendation information according to the intake; and
a third display module configured to display the dietary recommendation information.
Optionally, the weighing information further comprises a time identifier, the time identifier indicating whether the moment at which the camera sent the weighing instruction to the catering furniture was the dining start moment or the dining end moment, and the apparatus further comprises:
a fourth determination module configured to take, among the received weighing information, the weight of the objects on the catering furniture contained in the weighing information whose time identifier indicates the dining start moment as a first weight;
a fifth determination module configured to take, among the received weighing information, the weight of the objects on the catering furniture contained in the weighing information whose time identifier indicates the dining end moment as a second weight;
a first acquisition module configured to obtain, from the first weight, the first sub-weight measured by the target sub-weighing module corresponding to a target number;
a second acquisition module configured to obtain, from the second weight, the second sub-weight measured by the target sub-weighing module corresponding to that target number;
a sixth determination module configured to take the difference between the first sub-weight and the second sub-weight as the target intake;
a second generation module configured to generate dietary recommendation information according to the target intake and the diet label of the sub-image corresponding to the target number; and
a fourth display module configured to display the dietary recommendation information.
According to a seventh aspect of the embodiments of the present disclosure, a diet monitoring apparatus is provided. The apparatus comprises:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
compare whether the dining table surface image at a dining moment differs from an original dining table surface image, the original dining table surface image being an image of the dining table with no articles placed on it;
when the dining table surface image differs from the original dining table surface image, obtain a difference image, the difference image comprising N sub-images, the area of any sub-image being smaller than the area of the original image, and N being an integer greater than or equal to 1;
determine a diet label of each of the N sub-images; and
send catering information, the catering information comprising the diet labels of the N sub-images.
According to an eighth aspect of the embodiments of the present disclosure, a diet monitoring apparatus is provided. The apparatus comprises:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
receive catering information, the catering information comprising the diet labels of N sub-images; and
display the catering information.
According to a ninth aspect of the embodiments of the present disclosure, a diet monitoring system is provided. The system comprises a camera and a terminal,
the camera comprising the diet monitoring apparatus of any implementation of the fourth aspect, and
the terminal comprising the diet monitoring apparatus of any implementation of the sixth aspect.
Optionally, the system further comprises the catering furniture of any implementation of the fifth aspect.
According to a tenth aspect of the embodiments of the present disclosure, a diet monitoring system is provided. The system comprises a camera and a terminal,
the camera comprising the diet monitoring apparatus of the seventh aspect, and
the terminal comprising the diet monitoring apparatus of the eighth aspect.
Optionally, the system further comprises the catering furniture of any implementation of the fifth aspect.
The technical solutions provided by the embodiments of the present disclosure may include the following beneficial effects:
In the diet monitoring method, apparatus, system and catering furniture provided by the embodiments of the present disclosure, the camera can compare whether the dining table surface image at the dining moment differs from the original dining table surface image and, when the two differ, obtain a difference image comprising N sub-images; the camera can then determine a diet label for each of the N sub-images, send catering information to the terminal, and send weighing instructions to the catering furniture at the dining start moment and the dining end moment. After the catering furniture receives a weighing instruction sent by the camera, it can use the weighing module to obtain the weight of the objects on it at the current moment and send weighing information to the terminal. The terminal can generate dietary recommendation information according to the catering information sent by the camera and the weighing information sent by the catering furniture. The user therefore does not need to manually enter diet records into the terminal, which simplifies the diet monitoring procedure and improves the efficiency and accuracy of diet monitoring.
It should be understood that the above general description and the following detailed description are merely exemplary and do not limit the present disclosure.
Brief description of the drawings
To describe the embodiments of the present disclosure more clearly, the drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present disclosure, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an implementation environment involved in a diet monitoring method according to an exemplary embodiment;
Fig. 2 is a flowchart of a diet monitoring method according to an exemplary embodiment;
Fig. 3 is a flowchart of another diet monitoring method according to an exemplary embodiment;
Fig. 4 is a flowchart of yet another diet monitoring method according to an exemplary embodiment;
Fig. 5-1 is a flowchart of yet another diet monitoring method according to an exemplary embodiment;
Fig. 5-2 is a flowchart of a method by which a camera determines the dining moment, the dining start moment and the dining end moment according to an exemplary embodiment;
Fig. 5-3 is a schematic diagram of a difference image according to an exemplary embodiment;
Fig. 5-4 is a flowchart of a method by which a camera generates catering information according to an exemplary embodiment;
Fig. 5-5 is a schematic diagram of the region division of the table surface of a dining table according to an exemplary embodiment;
Fig. 5-6 is a schematic diagram of a terminal displaying catering information according to an exemplary embodiment;
Fig. 5-7 is a flowchart of a method by which catering furniture obtains the weight of objects according to an exemplary embodiment;
Fig. 5-8 is a flowchart of a method by which a dining table generates weighing information according to an exemplary embodiment;
Fig. 5-9 is a schematic diagram of a terminal displaying weighing information according to an exemplary embodiment;
Fig. 5-10 is a flowchart of a method by which a terminal displays weighing information according to an exemplary embodiment;
Fig. 5-11 is a schematic diagram of another terminal displaying weighing information according to an exemplary embodiment;
Fig. 5-12 is a schematic diagram of the moments at which a terminal receives weighing information according to an exemplary embodiment;
Fig. 5-13 is a schematic diagram of a terminal displaying dietary recommendation information according to an exemplary embodiment;
Fig. 5-14 is a flowchart of a method by which a terminal generates dietary recommendation information according to an exemplary embodiment;
Fig. 5-15 is a schematic diagram of another terminal displaying dietary recommendation information according to an exemplary embodiment;
Fig. 6-1 is a block diagram of a diet monitoring apparatus according to an exemplary embodiment;
Fig. 6-2 is a block diagram of another diet monitoring apparatus according to an exemplary embodiment;
Fig. 7-1 is a block diagram of catering furniture according to an exemplary embodiment;
Fig. 7-2 is a schematic diagram of a dining table according to an exemplary embodiment;
Fig. 7-3 is a schematic diagram of a chair according to an exemplary embodiment;
Fig. 8-1 is a block diagram of yet another diet monitoring apparatus according to an exemplary embodiment;
Fig. 8-2 is a block diagram of yet another diet monitoring apparatus according to an exemplary embodiment;
Fig. 9 is a block diagram of yet another diet monitoring apparatus according to an exemplary embodiment;
Fig. 10 is a block diagram of yet another diet monitoring apparatus according to an exemplary embodiment.
The accompanying drawings herein are incorporated into and form part of this specification; they show embodiments consistent with the present disclosure and, together with the specification, serve to explain the principles of the present disclosure.
Detailed description of the embodiments
To make the objectives, technical solutions and advantages of the present disclosure clearer, the present disclosure is described in further detail below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present disclosure rather than all of them. All other embodiments obtained by a person of ordinary skill in the art on the basis of the embodiments of the present disclosure without creative effort fall within the protection scope of the present disclosure.
Fig. 1 is a schematic diagram of an implementation environment involved in a diet monitoring method according to an exemplary embodiment. The implementation environment may comprise a camera 01, catering furniture 02 and a terminal 03, where the catering furniture 02 may comprise a dining table 021 and a chair 022. The terminal 03 may be a smartphone, a computer, a multimedia player, an e-reader, a wearable device or the like. The camera 01 is arranged above the catering furniture 02; the camera 01 may be connected to the catering furniture 02 and to the terminal 03 through a wired or wireless network, and a connection may likewise be established between the catering furniture 02 and the terminal 03 through a wired or wireless network.
The camera 01 is used to acquire dining table surface images of the catering furniture 02, generate catering information according to the acquired images and send the catering information to the terminal 03; the catering furniture 02 is used to generate weighing information according to the weight parameters it obtains and send the weighing information to the terminal 03; and the terminal 03 can generate dietary recommendation information according to the catering information sent by the camera 01 and the weighing information sent by the catering furniture 02.
Fig. 2 is a flowchart of a diet monitoring method according to an exemplary embodiment. The method can be applied to the camera 01 shown in Fig. 1. As shown in Fig. 2, the method comprises:
In step 201, compare whether the dining table surface image at the dining moment differs from the original dining table surface image, the original dining table surface image being an image of the table surface of the dining table with no articles placed on it.
In step 202, when the dining table surface image differs from the original dining table surface image, obtain a difference image, the difference image comprising N sub-images, the area of any sub-image being smaller than the area of the original image, and N being an integer greater than or equal to 1.
In step 203, determine a diet label of each of the N sub-images.
In step 204, send catering information, the catering information comprising the diet labels of the N sub-images.
For example, the catering information is sent to a terminal.
In summary, in the diet monitoring method provided by the embodiments of the present disclosure, the camera can compare whether the dining table surface image at the dining moment differs from the original dining table surface image, obtain a difference image comprising N sub-images when the two differ, determine a diet label for each of the N sub-images, and send catering information comprising those diet labels to the terminal. The user therefore does not need to manually enter diet records into the terminal, which simplifies the diet monitoring procedure and improves the efficiency of diet monitoring.
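The disclosure gives no code for steps 201-204; the following Python fragment is only a minimal sketch of one possible realisation, assuming OpenCV 4 is available and that classify_food (a food classifier returning a diet label) and send_to_terminal are hypothetical helpers not described in the disclosure.

```python
import cv2

MIN_SUB_IMAGE_AREA = 500  # assumed minimum changed area per sub-image, in pixels

def monitor_dining(table_image, original_image, classify_food, send_to_terminal):
    """Sketch of steps 201-204: compare, segment, label, send."""
    # Step 201: compare the dining-moment image with the original (empty-table) image.
    diff = cv2.absdiff(cv2.cvtColor(table_image, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(original_image, cv2.COLOR_BGR2GRAY))
    _, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    # Step 202: each sufficiently large changed region yields one sub-image.
    sub_images = []
    for contour in contours:
        if cv2.contourArea(contour) < MIN_SUB_IMAGE_AREA:
            continue
        x, y, w, h = cv2.boundingRect(contour)
        sub_images.append(table_image[y:y + h, x:x + w])

    if not sub_images:
        return  # no difference from the original image; nothing to report

    # Step 203: determine a diet label for each of the N sub-images.
    labels = [classify_food(image) for image in sub_images]

    # Step 204: send catering information containing the N diet labels.
    send_to_terminal({"diet_labels": labels})
```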
Optionally, the method further comprises:
periodically acquiring dining table surface images, with a preset duration as the period length; and
when the area of the difference image between the dining table surface image at the current acquisition moment and the dining table surface image at the previous acquisition moment is greater than a preset area threshold, determining that the current acquisition moment is the dining moment.
The camera can thus determine the dining moment by comparing the dining table surface images acquired at different moments, which improves the accuracy of diet monitoring.
Optionally, the method further comprises:
when the area of the difference image between the dining table surface image at the previous acquisition moment and the dining table surface image at the acquisition moment before the previous one is not greater than the preset area threshold, determining that the current acquisition moment is the dining start moment.
The camera can thus determine the dining start moment by comparing the dining table surface images acquired at different moments, which improves the accuracy of diet monitoring.
Optionally, the method further comprises:
when the area of the difference image between the dining table surface image at the current acquisition moment and the dining table surface image at the previous acquisition moment is not greater than the preset area threshold, judging whether the previous acquisition moment was a dining moment; and
when the previous acquisition moment was a dining moment, determining that the current acquisition moment is the dining end moment.
The camera can thus determine the dining end moment by comparing the dining table surface images acquired at different moments, which improves the accuracy of diet monitoring.
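As an illustration of one reading of these rules for deriving the dining moment, the dining start moment and the dining end moment from periodically acquired images, the sketch below keeps the last two acquisitions in memory; diff_area (a callable returning the changed area between two images) and the threshold value are assumptions, not part of the disclosure.

```python
PRESET_AREA_THRESHOLD = 1000  # assumed value, in pixels

class DiningDetector:
    """Tracks acquisition moments and flags dining / start / end moments."""

    def __init__(self, diff_area):
        self.diff_area = diff_area      # callable(img_a, img_b) -> changed area
        self.prev = None                # image at the previous acquisition moment
        self.prev_prev = None           # image at the acquisition moment before that
        self.prev_was_dining = False

    def on_acquisition(self, current):
        event = None
        if self.prev is not None:
            changed = self.diff_area(current, self.prev)
            if changed > PRESET_AREA_THRESHOLD:
                event = "dining_moment"
                # If the table was static before this change, dining is just starting.
                if (self.prev_prev is not None and
                        self.diff_area(self.prev, self.prev_prev) <= PRESET_AREA_THRESHOLD):
                    event = "dining_start"
                self.prev_was_dining = True
            else:
                # No change now; if the previous moment was a dining moment,
                # the current moment is taken as the dining end moment.
                if self.prev_was_dining:
                    event = "dining_end"
                self.prev_was_dining = False
        self.prev_prev, self.prev = self.prev, current
        return event
```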
Optionally, the method further comprises:
at the dining start moment and the dining end moment, respectively sending a weighing instruction, the weighing instruction instructing the catering furniture to obtain the weight of the objects on the catering furniture and to send that weight to the terminal, the catering furniture comprising at least one of a dining table and a chair, the chair being located within a preset distance of the dining table.
The camera can thus send weighing instructions to the catering furniture so that the catering furniture obtains the weight of the objects on it and sends that weight to the terminal, which further improves the accuracy of diet monitoring.
Optionally, the weighing instruction further comprises a time identifier, the time identifier indicating whether the moment at which the weighing instruction is sent is the dining start moment or the dining end moment.
The catering furniture can then generate weighing information according to the time identifier, which further improves the accuracy of diet monitoring.
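The disclosure does not fix a concrete format for the weighing instruction; a hypothetical message could look like the following, where the "moment" field is the time identifier distinguishing the dining start moment from the dining end moment (all field names are assumed).

```python
import json
import time

def build_weighing_instruction(is_start):
    """Build a hypothetical weighing-instruction message carrying the time identifier."""
    return json.dumps({
        "type": "weighing_instruction",
        "moment": "dining_start" if is_start else "dining_end",
        "timestamp": time.time(),
    })
```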
Optionally, the method further comprises:
obtaining a preset region division of the table surface of the dining table in the original dining table surface image;
determining, according to the region division, the region in the original dining table surface image in which each of the N sub-images is located;
obtaining a preset correspondence between the region division of the original dining table surface image and region numbers;
determining, according to the correspondence, the number of the region in the original dining table surface image in which each sub-image is located; and
generating the catering information, the catering information comprising at least the number of the region in the original dining table surface image in which each sub-image is located.
The catering information the camera sends to the terminal thus includes the number of the region in which each sub-image is located, so that the terminal can match each sub-image with the corresponding weighing information according to the region number, which further improves the accuracy of diet monitoring.
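A minimal sketch of this region-numbering step, assuming for illustration that the tabletop appears in the original image as an axis-aligned grid of numbered cells; the grid geometry and the field names are assumptions, not taken from the disclosure.

```python
def region_number(cx, cy, table_box, rows, cols):
    """Map the centre (cx, cy) of a sub-image to the number of the tabletop region
    it falls in, assuming the preset region division is a rows x cols grid."""
    x0, y0, width, height = table_box          # tabletop bounds in the original image
    col = int((cx - x0) / (width / cols))
    row = int((cy - y0) / (height / rows))
    return row * cols + col + 1                # regions numbered 1..M

def build_catering_info(sub_image_boxes, labels, table_box, rows, cols):
    """Attach a region number to each sub-image's diet label."""
    info = []
    for (x, y, w, h), label in zip(sub_image_boxes, labels):
        number = region_number(x + w / 2, y + h / 2, table_box, rows, cols)
        info.append({"region_number": number, "diet_label": label})
    return info
```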
Optionally, the catering information further comprises the N sub-images, the N sub-images and the diet labels of the N sub-images being in one-to-one correspondence.
In summary, in the diet monitoring method provided by the embodiments of the present disclosure, the camera can compare whether the dining table surface image at the dining moment differs from the original dining table surface image, obtain a difference image comprising N sub-images when the two differ, determine a diet label for each of the N sub-images, and send catering information comprising those diet labels to the terminal. The user therefore does not need to manually enter diet records into the terminal, which simplifies the diet monitoring procedure and improves the efficiency of diet monitoring.
Fig. 3 is a flowchart of another diet monitoring method according to an exemplary embodiment. The method can be applied to the catering furniture 02 shown in Fig. 1. As shown in Fig. 3, the method comprises:
In step 301, receive a weighing instruction, the weighing instruction being sent by the camera at the dining start moment or the dining end moment.
In step 302, according to the weighing instruction, use a weighing module to obtain the weight of the objects on the catering furniture at the current moment.
In step 303, send weighing information, the weighing information comprising the weight of the objects on the catering furniture at the current moment and an identifier of the catering furniture.
For example, the weighing information is sent to a terminal.
In summary, in the diet monitoring method provided by the embodiments of the present disclosure, after the catering furniture receives the weighing instruction sent by the camera, it can use the weighing module to obtain the weight of the objects on it at the current moment and send weighing information, comprising that weight and the identifier of the catering furniture, to the terminal. The terminal can therefore monitor the user's diet according to the weighing information provided by the catering furniture, which improves the accuracy of diet monitoring.
Optionally, when the catering furniture is a dining table, the table surface of the dining table is divided into M regions, the weighing module comprises M sub-weighing modules, and the M sub-weighing modules are arranged below the M regions in one-to-one correspondence.
Obtaining, according to the weighing instruction and by means of the weighing module, the weight of the objects on the catering furniture at the current moment then comprises:
determining, among the M sub-weighing modules, the target sub-weighing modules that perceive a weight change; and
obtaining the weight measured by each target sub-weighing module.
With M sub-weighing modules arranged in the dining table, the dining table can obtain the weight measured by each target sub-weighing module among them, which improves the accuracy of diet monitoring.
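The dining-table side can be sketched as follows, assuming each sub-weighing module exposes a current reading through a hypothetical read_weight driver call and that the table keeps the readings taken when the previous weighing instruction was handled; the change threshold is likewise an assumption.

```python
WEIGHT_CHANGE_THRESHOLD = 5.0  # grams; assumed sensitivity

def handle_weighing_instruction(read_weight, last_readings, module_ids):
    """Return entries for the sub-weighing modules whose weight changed (the target modules),
    plus the updated readings to keep for the next weighing instruction."""
    target_entries = []
    new_readings = {}
    for module_id in module_ids:
        weight = read_weight(module_id)
        new_readings[module_id] = weight
        previous = last_readings.get(module_id, 0.0)
        if abs(weight - previous) > WEIGHT_CHANGE_THRESHOLD:
            # This sub-weighing module perceived a weight change: it is a target module.
            target_entries.append({"region_number": module_id, "weight": weight})
    return target_entries, new_readings
```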
Optionally, the method further comprises:
determining, according to the preset region division of the table surface of the dining table, the region in which each target sub-weighing module is located;
obtaining the preset correspondence between the region division of the original dining table surface image and region numbers;
determining, according to the correspondence, the number of the region in which each target sub-weighing module is located; and
generating the weighing information, the weighing information comprising at least the number of the region in which each target sub-weighing module is located and the weight measured by each target sub-weighing module.
The weighing information generated by the dining table thus also includes the numbers of the regions in which the target sub-weighing modules are located, so that the terminal can match the catering information with the weighing information according to these region numbers, which improves the accuracy of diet monitoring.
Optionally, when the catering furniture is a chair,
obtaining, according to the weighing instruction and by means of the weighing module, the weight of the objects placed on the catering furniture at the current moment comprises:
obtaining, according to the weighing instruction and by means of a weighing module arranged below the seat of the chair, the weight of the human body borne by the chair at the current moment.
The chair can thus obtain the user's body weight before and after dining through the weighing module arranged below the seat, which improves the accuracy of diet monitoring.
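For the chair, a sketch under similar assumptions (a single load cell under the seat, read through a hypothetical read_seat_load_cell driver, with assumed field names) might be:

```python
def handle_chair_weighing(read_seat_load_cell, chair_id, moment):
    """Report the body weight currently borne by the chair, tagged with the time identifier."""
    body_weight = read_seat_load_cell()
    return {"furniture_id": chair_id, "moment": moment, "weight": body_weight}
```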
Optionally, the weighing instruction further comprises a time identifier, the time identifier indicating whether the moment at which the weighing instruction is sent is the dining start moment or the dining end moment;
the weighing information then also comprises the time identifier.
In summary, in the diet monitoring method provided by the embodiments of the present disclosure, after the catering furniture receives the weighing instruction sent by the camera, it can use the weighing module to obtain the weight of the objects on it at the current moment and send weighing information, comprising that weight and the identifier of the catering furniture, to the terminal. The terminal can therefore monitor the user's diet according to the weighing information provided by the catering furniture, which improves the accuracy of diet monitoring.
Fig. 4 is a flowchart of yet another diet monitoring method according to an exemplary embodiment. The method can be applied to the terminal 03 shown in Fig. 1. As shown in Fig. 4, the method comprises:
In step 401, receive catering information, the catering information comprising the diet labels of N sub-images.
For example, the catering information sent by the camera is received.
In step 402, display the catering information.
In summary, in the diet monitoring method provided by the embodiments of the present disclosure, after the terminal receives the catering information sent by the camera, it can display the catering information, which comprises the diet labels of the N sub-images. The terminal can thus display the diet record intuitively, which makes diet monitoring more intuitive.
Optionally, the method further comprises:
receiving weighing information, the weighing information comprising the weight of the objects on the catering furniture at the current moment and an identifier of the catering furniture, the weight being obtained by the catering furniture using a weighing module;
for example, the weighing information sent by the catering furniture is received; and
displaying the weighing information.
After receiving the weighing information sent by the catering furniture, the terminal can display it intuitively, which makes diet monitoring more intuitive.
Optionally, the catering information further comprises:
the N sub-images, the N sub-images and the diet labels of the N sub-images being in one-to-one correspondence;
and displaying the catering information comprises:
displaying the N sub-images and the diet labels of the N sub-images in one-to-one correspondence.
The terminal can thus display the N sub-images and their diet labels in correspondence, which improves the accuracy and intuitiveness of diet monitoring.
Optionally, the catering furniture comprises at least one of a dining table and a chair; when the catering furniture is a dining table, the table surface of the dining table is divided into M regions, the weighing module comprises M sub-weighing modules, and the M sub-weighing modules are arranged below the M regions in one-to-one correspondence;
the catering information comprises at least the number of the region in the original dining table surface image in which each of the N sub-images is located, the original dining table surface image being an image of the dining table with no articles placed on it; the weighing information comprises at least the number of the region in which each target sub-weighing module is located and the weight measured by each target sub-weighing module, the target sub-weighing modules being those of the M sub-weighing modules that perceive a weight change;
and displaying the weighing information comprises:
matching the numbers of the regions in which the target sub-weighing modules are located against the numbers of the regions in the original dining table surface image in which the N sub-images are located, and determining the target numbers for which the region number of a sub-image is identical to the region number of a target sub-weighing module; and
displaying, in association, the sub-image corresponding to each target number and the weight measured by the target sub-weighing module corresponding to that target number.
The terminal can thus display, in association, the sub-image corresponding to a target number and the weight measured by the corresponding target sub-weighing module, which improves the accuracy and intuitiveness of diet monitoring.
Optionally, the weighing information further comprises a moment identifier indicating whether the moment at which the camera sent the weighing instruction to the food and drink furniture was the dining start moment or the dining end moment, and the method further comprises:
among the received weighing information, determining as the first weight the weight of the objects on the food and drink furniture at the current moment contained in the weighing information whose moment identifier indicates the dining start moment;
among the received weighing information, determining as the second weight the weight of the objects on the food and drink furniture at the current moment contained in the weighing information whose moment identifier indicates the dining end moment;
determining the difference between the first weight and the second weight as the intake;
generating dietary recommendation information according to the intake;
displaying the dietary recommendation information.
According to the diet monitoring method provided by the embodiments of the present disclosure, the terminal can also generate and display dietary recommendation information according to the received catering information and weighing information, enriching the content displayed by the terminal.
Optionally, the weighing information further comprises a moment identifier indicating whether the moment at which the camera sent the weighing instruction to the food and drink furniture was the dining start moment or the dining end moment. When the food and drink furniture is a dining table, determining the difference between the first weight and the second weight as the intake comprises:
among the received weighing information, determining as the first weight the weight of the objects on the food and drink furniture at the current moment contained in the weighing information whose moment identifier indicates the dining start moment;
among the received weighing information, determining as the second weight the weight of the objects on the food and drink furniture at the current moment contained in the weighing information whose moment identifier indicates the dining end moment;
obtaining, from the first weight, the first sub-weight measured by the target sub-Weighing module corresponding to a target number;
obtaining, from the second weight, the second sub-weight measured by the target sub-Weighing module corresponding to that target number;
determining the difference between the first sub-weight and the second sub-weight as the target intake;
generating dietary recommendation information according to the target intake and the diet label of the sub-image corresponding to the target number;
displaying the dietary recommendation information.
In summary, according to the diet monitoring method provided by the embodiments of the present disclosure, besides displaying the received catering information and the weighing information corresponding to that catering information, the terminal can also display dietary recommendation information generated from the catering information and the weighing information, enriching the content displayed during diet monitoring.
Fig. 5-1 is a flowchart of another diet monitoring method according to an exemplary embodiment. The method may be applied in the implementation environment shown in Fig. 1. As shown in Fig. 5-1, the method comprises:
In step 501, the camera compares whether the dining table top image at the dining moment differs from the dining table top original image.
The dining table top original image is a desktop image of the dining table with no articles placed on it. In the embodiments of the present disclosure, the dining table top original image can be pre-stored in the camera; when capturing dining table top images, the camera can, after determining that the current capture moment is a dining moment, compare the dining table top image of the current capture moment with the dining table top original image. The camera can determine whether the current capture moment is a dining moment, a dining start moment or a dining end moment by contrasting the dining table top images captured at successive moments. Fig. 5-2 is a flowchart of a method by which the camera determines the dining moment, the dining start moment and the dining end moment, according to an exemplary embodiment. As shown in Fig. 5-2, the method comprises:
In step 5011, the camera periodically captures dining table top images, using a preset duration as the period length.
The preset duration may be 1 minute or 1 second, which is not limited by the embodiments of the present disclosure. For example, assuming the preset duration is 1 second, the camera captures one dining table top image every second.
In step 5012, judge whether the area of the difference image between the dining table top image of the current capture moment and the dining table top image of the previous capture moment is greater than a preset area threshold.
When the area of the difference image between the dining table top image of the current capture moment and that of the previous capture moment is greater than the preset area threshold, perform step 5013; when it is not greater than the preset area threshold, perform step 5016.
In the embodiments of the present disclosure, when the area of that difference image is greater than the preset area threshold, the camera can determine that the image of the dining table has changed, and performs step 5013; when the area of the difference image is not greater than the preset area threshold, the camera can determine that the dining table top image has not changed, and performs step 5016.
In step 5013, determine that the current capture moment is a dining moment. Then perform step 5014.
When the area of the difference image between the dining table top image of the current capture moment and that of the previous capture moment is greater than the preset area threshold, the camera can determine that the current capture moment is a dining moment. For example, assuming the area of the difference image between the dining table top image captured by the camera at 12:00 and the one captured 1 second earlier is greater than the preset area threshold, the camera can determine 12:00 to be a dining moment, and can then compare whether the dining table top image of this dining moment differs from the dining table top original image.
It should be noted that, in practical applications, in order to determine the dining moment more accurately, when the camera judges that the area of the difference image between the dining table top image of the current capture moment and that of the previous capture moment is greater than the preset area threshold, it can continue to contrast the dining table top images of several further capture moments. For example, the camera can further check whether the area of the difference image between the dining table top image of the current capture moment and that of the next capture moment is greater than the preset area threshold, and, when it is, determine that the next capture moment is a dining moment.
In step 5014, judge whether the area of the difference image between the dining table top image of the previous capture moment and that of the capture moment before it is greater than the preset area threshold.
After the camera determines a dining moment, it can further determine the dining start moment and the dining end moment. When the area of the difference image between the dining table top image of the previous capture moment and that of the capture moment before it is not greater than the preset area threshold, perform step 5015; when it is greater than the preset area threshold, perform step 5011, that is, continue capturing dining table top images.
In step 5015, determine that the current capture moment is the dining start moment.
When the area of the difference image between the dining table top image of the current capture moment and that of the previous capture moment is greater than the preset area threshold, while the area of the difference image between the dining table top image of the previous capture moment and that of the capture moment before it is not greater than the preset area threshold, the camera can determine that the current capture moment is the dining start moment. In other words, if the camera determines that the dining table image at the current moment has changed compared with the previous moment, while the dining table images at the previous moment and the moment before it had not changed, the camera can determine that the current capture moment is the dining start moment.
In step 5016, judge whether the previous capture moment was a dining moment.
When the area of the difference image between the dining table top image of the current capture moment and that of the previous capture moment is not greater than the preset area threshold, the camera can further judge whether the previous capture moment was a dining moment. When the previous capture moment was a dining moment, perform step 5017; when it was not, perform step 5011, that is, continue capturing dining table top images.
In step 5017, determine that the current capture moment is the dining end moment.
When the previous capture moment was a dining moment, the camera can determine that the current capture moment is the dining end moment. That is, when the area of the difference image between the dining table top image of the current capture moment and that of the previous capture moment is not greater than the preset area threshold, and the camera judges that the previous capture moment was a dining moment, the camera can determine that the current capture moment is the dining end moment. For example, assuming the area of the difference image between the dining table top image captured by the camera at 12:30 and the one captured one second earlier is not greater than the preset area threshold, and the camera judges that one second earlier was a dining moment, the camera can further determine that the current moment, 12:30, is the dining end moment.
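A minimal illustrative sketch of the frame-comparison logic of steps 5011 to 5017 is given below. It is not part of the claimed method; the per-pixel and per-frame thresholds, and the assumption that frames are grayscale numpy arrays, are illustrative choices standing in for the "preset area threshold".

```python
import numpy as np

AREA_THRESHOLD = 500        # assumed preset area threshold, in pixels
PIXEL_DIFF_THRESHOLD = 30   # assumed per-pixel intensity difference

def changed_area(frame_a, frame_b):
    """Number of pixels whose intensity differs noticeably between two frames."""
    diff = np.abs(frame_a.astype(int) - frame_b.astype(int))
    return int(np.count_nonzero(diff > PIXEL_DIFF_THRESHOLD))

def classify(prev2, prev1, current, prev1_was_dining):
    """Classify the current capture as 'dining start', 'dining', 'dining end' or 'idle'."""
    if changed_area(current, prev1) > AREA_THRESHOLD:
        # The table image changed: the current capture moment is a dining moment.
        if changed_area(prev1, prev2) <= AREA_THRESHOLD:
            return "dining start"   # the previous frame had not yet changed
        return "dining"
    # No change in this frame: if the previous capture was a dining moment,
    # the meal has just ended.
    return "dining end" if prev1_was_dining else "idle"
```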
In step 502, when the dining table top image differs from the dining table top original image, the camera obtains a difference image, the difference image comprising N sub-images.
The area of any one of the N sub-images is smaller than the area of the original image, and N is an integer greater than or equal to 1. The camera can contrast each pixel in the dining table top image with the corresponding pixel in the dining table top original image, and determine the image formed by the differing pixels as the difference image. The difference image can comprise N sub-images, and since the area of any sub-image is smaller than the area of the original image, the camera avoids treating the image of a new tablecloth as a difference image when the tablecloth on the dining table has been changed. For example, Fig. 5-3 is a schematic diagram of a difference image according to an exemplary embodiment. Assuming the dining table top original image 120 is a rectangular white image and the dining table top image captured by the camera at the dining moment is as shown in Fig. 5-3, the difference image determined by the camera can be image 100, and the difference image 100 comprises 4 sub-images: sub-image 101 to sub-image 104.
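The following is one possible sketch of the pixel comparison described above, assuming grayscale numpy arrays and using connected-region labelling to split the difference image into sub-images. The 0.9 full-table ratio used to discard a tablecloth change is an assumed value, not taken from the text.

```python
import numpy as np
from scipy import ndimage

def extract_subimages(original, current, pixel_thresh=30, full_table_ratio=0.9):
    """Return the bounding slices of the N sub-images of the difference image."""
    diff_mask = np.abs(current.astype(int) - original.astype(int)) > pixel_thresh
    labels, n = ndimage.label(diff_mask)          # connected difference regions
    subimages = []
    for region in ndimage.find_objects(labels):   # one bounding slice pair per region
        area = (region[0].stop - region[0].start) * (region[1].stop - region[1].start)
        # A sub-image must be smaller than the whole table image, so a tablecloth
        # change that alters the entire surface is not treated as a difference image.
        if area < full_table_ratio * original.size:
            subimages.append(region)
    return subimages
```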
In step 503, the camera determines the diet label of each of the N sub-images.
In the embodiments of the present disclosure, a plurality of diet labels are pre-stored in the camera. A diet label is the name of a staple food, dish, fruit or beverage commonly found in a diet, such as rice, steamed bun, noodles, green vegetables, stir-fried vegetables with meat, fish, pizza or soup. After obtaining the difference image, the camera can analyze the color and texture of each of the N sub-images and then determine the diet label of each sub-image. The camera can also store a reference image corresponding to each diet label, contrast each sub-image with the reference image of each diet label one by one, and then determine the diet label of each sub-image. For example, for the 4 sub-images shown in Fig. 5-3, the camera can determine that the diet labels of sub-images 101 to 104 are, respectively: steamed bun, fish, green vegetables and pizza.
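As one hypothetical realisation of the color analysis and reference-image comparison mentioned above, a sub-image could be labelled by the nearest reference color histogram; the histogram representation and the L1 distance are assumptions, not the claimed technique.

```python
import numpy as np

def color_histogram(image, bins=16):
    """Per-channel color histogram, normalised so images of different sizes compare."""
    hist = [np.histogram(image[..., c], bins=bins, range=(0, 255))[0] for c in range(3)]
    hist = np.concatenate(hist).astype(float)
    return hist / hist.sum()

def diet_label(subimage, references):
    """references: dict mapping a diet label (e.g. 'steamed bun') to a reference image."""
    target = color_histogram(subimage)
    # Pick the label whose reference histogram is closest to the sub-image histogram.
    return min(references,
               key=lambda lbl: np.abs(color_histogram(references[lbl]) - target).sum())
```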
In step 504, the camera sends catering information to the terminal.
The catering information comprises the diet labels of the N sub-images. For example, the catering information sent by the camera to the terminal can comprise the diet labels of sub-images 101 to 104: steamed bun, fish, green vegetables and pizza.
In the embodiments of the present disclosure, in order to improve the accuracy of diet monitoring, the catering information sent by the camera to the terminal can also comprise the number of the region, in the dining table top original image, in which each of the N sub-images is located. Fig. 5-4 is a flowchart of a method by which the camera generates the catering information, according to an exemplary embodiment. As shown in Fig. 5-4, the method comprises:
In step 5041, obtain the preset region division of the desktop of the dining table in the dining table top original image.
In the embodiments of the present disclosure, in order to facilitate counting the intake of the diet corresponding to each sub-image, the desktop of the dining table can be divided into a plurality of regions, each region being provided with a corresponding Weighing module for obtaining the weight of the articles placed in that region. The region division of the dining table top original image, and the correspondence between the region division and the region numbers, can be preset in the camera. For example, Fig. 5-5 is a schematic diagram of the region division of the desktop of a dining table according to an exemplary embodiment. As shown in Fig. 5-5, in the dining table top original image the desktop 120 of the dining table can be divided into 8 regions, numbered 1 to 8.
In step 5042, determine, according to the region division, the region in the dining table top original image in which each of the N sub-images is located.
According to the region division and the position of each sub-image within the captured dining table top image, the camera can determine the region in the dining table top original image in which each of the N sub-images is located.
In step 5043, obtain the preset correspondence between the region division of the dining table top original image and the region numbers.
For example, the preset correspondence obtained by the camera can be as shown in Fig. 5-5: the desktop of the dining table is divided into 8 regions, numbered 1 to 8.
In step 5044, determine, according to the correspondence, the number of the region in the dining table top original image in which each sub-image is located.
For example, assuming the difference image determined by the camera is as shown in Fig. 5-3 and the region division of the dining table top is as shown in Fig. 5-5, the camera can determine that, among the 4 sub-images, sub-image 1 is located in region 3 of the dining table top original image, sub-image 2 in region 4, sub-image 3 in region 5, and sub-image 4 in region 6.
In step 5045, generate the catering information.
The catering information at least comprises the number of the region in the dining table top original image in which each sub-image is located. The catering information can further comprise the N sub-images and the diet labels of the N sub-images in one-to-one correspondence. For example, the catering information generated in the camera can be as shown in Table 1: it comprises 4 sub-images, the diet labels respectively corresponding to these 4 sub-images (steamed bun, fish, green vegetables and pizza), and the numbers of the regions in the dining table top original image in which these 4 sub-images are located (3, 4, 5, 6).
Table 1

Sub-image     Diet label          Region number
Sub-image 1   Steamed bun         3
Sub-image 2   Fish                4
Sub-image 3   Green vegetables    5
Sub-image 4   Pizza               6
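A minimal sketch of how catering information like Table 1 could be assembled is given below, assuming the table top is divided into a 2 x 4 grid of regions numbered 1 to 8 (as in Fig. 5-5) and that a sub-image belongs to the region containing the centre of its bounding box; the grid layout and function names are illustrative assumptions.

```python
def region_number(box, table_w, table_h, rows=2, cols=4):
    """Map a sub-image bounding box (x, y, w, h) to a region number 1..rows*cols."""
    cx, cy = box[0] + box[2] / 2, box[1] + box[3] / 2
    col = min(int(cx / (table_w / cols)), cols - 1)
    row = min(int(cy / (table_h / rows)), rows - 1)
    return row * cols + col + 1

def build_catering_info(subimages, labels, table_w, table_h):
    """subimages: list of (image, bounding_box); returns one record per sub-image."""
    return [{"subimage": img, "diet_label": lbl,
             "region": region_number(box, table_w, table_h)}
            for (img, box), lbl in zip(subimages, labels)]
```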
It should be noted that the camera can also send the captured dining table top image directly to the terminal, with the terminal then performing the operations indicated by steps 502 to 503, that is, the terminal obtains the difference image and determines the diet labels of the N sub-images in the difference image.
In step 505, the terminal displays the catering information.
After the terminal receives the catering information, it can display the N sub-images contained in the catering information and the diet labels of the N sub-images in one-to-one correspondence. Fig. 5-6 is a schematic diagram of the terminal displaying the catering information according to an exemplary embodiment. As shown in Fig. 5-6, the terminal displays, in one-to-one correspondence, the 4 sub-images and their diet labels: steamed bun, fish, green vegetables and pizza.
It should be noted that, in practical applications, after the terminal displays the catering information, if the user finds that a sub-image does not correspond to its diet label, the user can also modify the diet label of that sub-image and input the correct diet label. The terminal can then store the correspondence between that sub-image and the diet label input by the user and, when a sub-image similar to that sub-image is received again, display the stored diet label accordingly, thereby improving the accuracy of diet monitoring.
In step 506, the camera sends a weighing instruction to the food and drink furniture at the dining start moment and at the dining end moment, respectively.
The weighing instruction instructs the food and drink furniture to obtain the weight of the objects on the food and drink furniture and send it to the terminal. The food and drink furniture comprises at least one of a dining table and a chair, the chair being located within a preset distance range around the dining table. The weighing instruction also comprises a moment identifier indicating whether the moment at which the weighing instruction is sent is the dining start moment or the dining end moment. For example, assuming the dining start moment determined by the camera is 12:00, the camera can send a weighing instruction to the food and drink furniture at 12:00, and the weighing instruction can be: weigh, dining start moment.
In step 507, the food and drink furniture, according to the weighing instruction, uses the Weighing module to obtain the weight of the objects on the food and drink furniture at the current moment.
In the embodiments of the present disclosure, a Weighing module is provided in the food and drink furniture. After the food and drink furniture receives the weighing instruction sent by the camera, it can use the Weighing module to obtain the weight of the objects on the food and drink furniture at the current moment. When the food and drink furniture is a dining table, the desktop of the dining table is divided into M regions, the Weighing module comprises M sub-Weighing modules, and the M sub-Weighing modules are arranged below the M regions in one-to-one correspondence. Fig. 5-7 is a flowchart of a method by which the food and drink furniture obtains the weight of the objects, according to an exemplary embodiment. As shown in Fig. 5-7, when the food and drink furniture is a dining table, the method by which the dining table uses the Weighing module to obtain the weight of the objects at the current moment can comprise:
In step 5071, among the M sub-Weighing modules, determine the target sub-Weighing modules that have sensed a weight change.
When the food and drink furniture is a dining table, the dining table top is divided into M regions, and the M sub-Weighing modules are arranged below the M regions in one-to-one correspondence. Since at the dining moment articles may be placed on only some of the regions, the dining table can determine, among the M sub-Weighing modules, the sub-Weighing modules that have sensed a weight change, and determine those sub-Weighing modules as the target sub-Weighing modules. For example, assuming the dining table top is divided into 8 regions and articles are placed in regions 3 to 6, the dining table can determine that the target sub-Weighing modules that have sensed a weight change are the sub-Weighing modules arranged below regions 3 to 6.
In step 5072, obtain the weight measured by each target sub-Weighing module.
After the dining table determines the target sub-Weighing modules, it can obtain the weight measured by each target sub-Weighing module, without having to obtain the weights measured by all the sub-Weighing modules, which improves the efficiency with which the dining table obtains the weight of the objects. For example, assuming the target sub-Weighing modules are those arranged below regions 3 to 6 and the weights measured by these 4 target sub-Weighing modules are, respectively, 800 grams (g), 700 g, 500 g and 1000 g, the weight of the objects on the dining table at the current moment obtained using the Weighing module can be: 800 g, 700 g, 500 g, 1000 g.
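The following sketch illustrates steps 5071 and 5072 under stated assumptions: readings are kept as a dictionary from region number to grams, and a 5 g change threshold (an assumed value) decides which sub-Weighing modules are treated as target modules.

```python
def target_module_weights(previous_readings, current_readings, change_threshold_g=5):
    """Keep only the sub-Weighing modules whose reading changed, with their weights.

    previous_readings / current_readings: dict region_number -> weight in grams.
    """
    return {region: weight
            for region, weight in current_readings.items()
            if abs(weight - previous_readings.get(region, 0)) > change_threshold_g}

# Example: dishes placed in regions 3 to 6, the other regions unchanged.
# target_module_weights({1: 0, 2: 0, 3: 0, 4: 0, 5: 0, 6: 0, 7: 0, 8: 0},
#                       {1: 0, 2: 0, 3: 800, 4: 700, 5: 500, 6: 1000, 7: 0, 8: 0})
# -> {3: 800, 4: 700, 5: 500, 6: 1000}
```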
When the food and drink furniture is a chair, the chair can, according to the received weighing instruction, use the Weighing module arranged below its seat to obtain the weight of the human body borne by the chair at the current moment.
In the embodiments of the present disclosure, when the food and drink furniture is a chair, the chair is located within a preset distance range around the dining table, and the number of chairs can be greater than 1. After receiving the weighing instruction, each chair can first judge whether the Weighing module arranged below its seat has sensed a weight change, and only when a weight change is sensed does it obtain, via the Weighing module, the weight borne by the chair at the current moment; under normal circumstances, the weight borne by a chair is that of a human body. For example, assuming the preset distance range is 50 centimetres, 4 chairs are within 50 centimetres of the dining table and are identified as chair 1 to chair 4, and chair 1 and chair 3 sense a weight change, then the weights of the human bodies borne at the current moment, obtained by chair 1 and chair 3 using the Weighing modules arranged below their seats, can be: 49.5 kilograms (kg) and 70 kg.
In step 508, the food and drink furniture sends weighing information to the terminal.
After the food and drink furniture receives the weighing instruction and uses the Weighing module to obtain the weight of the objects on the food and drink furniture at the current moment according to that instruction, it can generate weighing information from the weight of the objects and send the weighing information to the terminal. The weighing information comprises the weight of the objects on the food and drink furniture at the current moment and the identifier of the food and drink furniture. When the food and drink furniture comprises a dining table and chairs, the identifier of the food and drink furniture can be "dining table" or "chair", and when the number of chairs is greater than 1, the identifier of a chair can also comprise the number of each chair, such as chair 1 and chair 2. The weighing information further comprises a moment identifier, which is the moment identifier contained in the weighing instruction received by the food and drink furniture and indicates whether the moment at which the weighing instruction was sent is the dining start moment or the dining end moment. For example, assuming the weighing instruction received by the chairs is "weigh, dining start moment", the weighing information sent by the chairs to the terminal can be: dining start moment; chair 1: 49.5 kilograms (kg); chair 2: 70 kg.
In practical applications, during a meal some chairs may be used for placing articles such as handbags, whose weight is usually small compared with that of a human body. After sensing a weight change, each chair can first use the Weighing module arranged below its seat to obtain the weight borne by the chair at the current moment, and judge whether this weight is greater than a preset weight threshold (such as 20 kg). If the weight is greater than the preset weight threshold, the chair determines that it is bearing the weight of a human body and sends the corresponding weighing information to the terminal; if the weight is less than the preset weight threshold, the chair determines that what it is bearing is not a human body, and can discard the weighing information instead of sending it to the terminal.
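A short sketch of the chair-side filter described above follows: only readings above the preset weight threshold (20 kg in the example) are treated as a person and reported to the terminal. The record layout is an illustrative assumption.

```python
WEIGHT_THRESHOLD_KG = 20   # preset weight threshold from the example above

def chair_weighing_info(chair_id, weight_kg, moment_flag):
    """Return a weighing record for a person, or None for a bag or other small object."""
    if weight_kg <= WEIGHT_THRESHOLD_KG:
        return None   # not a human body: discard, do not send to the terminal
    return {"furniture": chair_id, "weight_kg": weight_kg, "moment": moment_flag}

# chair_weighing_info("chair 1", 49.5, "dining start moment")
# -> {'furniture': 'chair 1', 'weight_kg': 49.5, 'moment': 'dining start moment'}
```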
It should be noted that, when the food and drink furniture is a dining table, the weighing information generated by the dining table further comprises the number of the region in which each target sub-Weighing module is located. Fig. 5-8 is a flowchart of a method by which the dining table generates the weighing information, according to an exemplary embodiment. As shown in Fig. 5-8, the method comprises:
In step 5081, the dining table determines, according to the preset region division of its desktop, the regions in which the target sub-Weighing modules are located.
For example, assuming the dining table top is divided into 8 regions and articles are placed in the 4 regions in the middle of the dining table top, the dining table can determine that the regions in which the target sub-Weighing modules that sensed a weight change are located are the 4 regions in the middle of the dining table.
In step 5082, obtain the preset correspondence between the region division of the dining table top original image and the region numbers.
For example, the preset correspondence between the region division of the dining table top original image and the region numbers in the dining table can be as shown in Fig. 5-5, where the regions on the two sides of the dining table are numbered 1, 2, 7 and 8, and the regions in the middle of the dining table are numbered 3 to 6.
In step 5083, determine, according to the correspondence, the number of the region in which each target sub-Weighing module is located.
For example, the dining table can determine that the numbers of the regions in which the 4 target sub-Weighing modules in the middle of the dining table are located are, respectively: 3, 4, 5, 6.
In step 5084, generate the weighing information, the weighing information at least comprising the number of the region in which each target sub-Weighing module is located and the weight measured by each target sub-Weighing module.
In the embodiments of the present disclosure, the weighing information generated by the dining table comprises the weight of the objects on the dining table at the current moment, the identifier of the dining table, and the moment identifier, where the weight of the objects on the dining table at the current moment comprises the number of the region in which each target sub-Weighing module is located and the weight measured by each target sub-Weighing module. For example, assuming the weighing instruction received by the dining table is "dining start moment, weigh", the weighing information generated by the dining table can be: 3: 800 g, 4: 700 g, 5: 500 g, 6: 1000 g; dining table; dining start moment.
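The following sketch assembles the weighing information of steps 5081 to 5084. The message layout mirrors the example in the text ("3:800g, 4:700g, 5:500g, 6:1000g; dining table; dining start time"); the exact wire format is an assumption.

```python
def build_table_weighing_info(target_weights, moment_flag, table_id="dining table"):
    """target_weights: dict region_number -> grams measured by that target sub-Weighing module."""
    readings = ", ".join(f"{region}:{grams}g"
                         for region, grams in sorted(target_weights.items()))
    return f"{readings}; {table_id}; {moment_flag}"

# build_table_weighing_info({3: 800, 4: 700, 5: 500, 6: 1000}, "dining start moment")
# -> "3:800g, 4:700g, 5:500g, 6:1000g; dining table; dining start moment"
```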
In step 509, the terminal displays the weighing information.
After receiving the weighing information sent by the food and drink furniture, the terminal can also display it, thereby prompting the user. For example, Fig. 5-9 is a schematic diagram of the terminal displaying weighing information according to an exemplary embodiment. As shown in Fig. 5-9, assuming the terminal receives the weighing information sent by the chairs (dining start moment; chair 1: 49.5 kilograms (kg); chair 2: 70 kg), it can display that weighing information in the interface shown in Fig. 5-9.
In the embodiments of the present disclosure, when the food and drink furniture is a dining table, the desktop of the dining table is divided into M regions, the Weighing module comprises M sub-Weighing modules, and the M sub-Weighing modules are arranged below the M regions in one-to-one correspondence. The catering information sent by the camera at least comprises the number of the region, in the dining table top original image, in which each of the N sub-images is located, the dining table top original image being the image of the dining table with no articles placed on it; and the weighing information sent by the dining table at least comprises the number of the region in which each target sub-Weighing module is located and the weight measured by each target sub-Weighing module, a target sub-Weighing module being one of the M sub-Weighing modules that has sensed a weight change. Therefore, when displaying the weighing information sent by the dining table, the terminal needs to associate the weighing information with the corresponding catering information before displaying it. Fig. 5-10 is a flowchart of a method by which the terminal displays the weighing information, according to an exemplary embodiment. As shown in Fig. 5-10, the method comprises:
In step 5091, match the number of the region in which each target sub-Weighing module is located against the number of the region, in the dining table top original image, in which each of the N sub-images is located, and determine the target numbers, namely the region numbers that appear both among the N sub-images and among the target sub-Weighing modules.
For example, assuming the catering information received by the terminal is as shown in Table 1, comprising 4 sub-images with their diet labels and region numbers, and the weighing information received by the terminal is: 3: 800 g, 4: 700 g, 5: 500 g, 6: 1000 g; dining table; dining start moment, the terminal can determine that the target numbers, that is, the region numbers that are identical between the 4 sub-images and the target sub-Weighing modules, are: 3, 4, 5, 6.
In step 5092, display, in association, the sub-image corresponding to each target number and the weight measured by the target sub-Weighing module corresponding to that target number.
Fig. 5-11 is a schematic diagram of another way in which the terminal displays weighing information, according to an exemplary embodiment. As shown in Fig. 5-11, the terminal can display, in association, the sub-images corresponding to target numbers 3, 4, 5 and 6 together with the weights measured by the target sub-Weighing modules corresponding to those target numbers: 800 g, 700 g, 500 g, 1000 g.
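A minimal sketch of steps 5091 and 5092 is given below: the region numbers in the weighing information are matched against the region numbers in the catering information so that each sub-image can be shown next to the weight measured beneath it. The record shapes are illustrative assumptions.

```python
def associate(catering_records, target_weights):
    """Match catering records with target sub-Weighing module weights.

    catering_records: list of dicts with keys 'region', 'diet_label', 'subimage'.
    target_weights: dict region_number -> grams.
    """
    return [{"region": rec["region"], "diet_label": rec["diet_label"],
             "weight_g": target_weights[rec["region"]]}
            for rec in catering_records
            if rec["region"] in target_weights]   # keep only the matching target numbers
```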
In the embodiments of the present disclosure, the weighing information sent by the food and drink furniture to the terminal further comprises a moment identifier. The terminal can analyze the received weighing information and catering information according to the moment identifier contained in the weighing information, so as to provide dietary recommendation information for the user.
In step 510, among the received weighing information, the terminal determines as the first weight the weight of the objects on the food and drink furniture at the current moment contained in the weighing information whose moment identifier indicates the dining start moment.
In the embodiments of the present disclosure, when the terminal has received at least two pieces of weighing information sent by the food and drink furniture, it can check the moment identifiers in the received weighing information in real time and obtain the dining start moment and the dining end moment nearest to the current moment. When the dining end moment is later than the dining start moment, the terminal can determine as the first weight the weight of the objects on the food and drink furniture at the current moment contained in the weighing information of that dining start moment. For example, Fig. 5-12 is a schematic diagram of the moments at which the terminal receives weighing information, according to an exemplary embodiment. Assuming the current moment is 13:00 and the terminal has received 4 pieces of weighing information, whose moment identifiers and reception moments are as shown in Fig. 5-12, where the moment identifier in weighing information 1 is the dining start moment and the terminal received that weighing information at 09:00, then among these 4 pieces of weighing information the terminal can obtain the dining start moment nearest to the current moment 13:00, namely 12:00 (in weighing information 3), and the dining end moment nearest to the current moment, namely 12:30 (in weighing information 4). Since the dining end moment 12:30 is later than the dining start moment 12:00, the terminal can determine as the first weight the weight of the objects on the food and drink furniture at the current moment contained in weighing information 3 of the dining start moment 12:00. Assuming weighing information 3 is: dining start moment; chair 1: 49.5 kilograms (kg); chair 2: 70 kg, the terminal can determine 49.5 kg and 70 kg as the first weight.
In step 511, among the received weighing information, the terminal determines as the second weight the weight of the objects on the food and drink furniture at the current moment contained in the weighing information whose moment identifier indicates the dining end moment.
Using the same judgment method as in step 510, the terminal can obtain, among the received weighing information, the dining start moment and the dining end moment nearest to the current moment and, when the dining end moment is later than the dining start moment, determine as the second weight the weight of the objects on the food and drink furniture at the current moment contained in the weighing information of that dining end moment. For example, among the weighing information shown in Fig. 5-12, the terminal can determine as the second weight the weight of the objects on the food and drink furniture at the current moment contained in weighing information 4 of the dining end moment 12:30. Assuming weighing information 4 is: dining end moment; chair 1: 50 kilograms (kg); chair 2: 71 kg, the terminal can determine 50 kg and 71 kg as the second weight.
In step 512, the terminal determines the difference between the first weight and the second weight as the intake.
In the embodiments of the present disclosure, according to the identifier of the food and drink furniture in the weighing information, the terminal can determine the difference between the first weight and the second weight corresponding to each food and drink furniture identifier as the intake corresponding to that food and drink furniture identifier, the intake indicating the weight of the diet consumed by the diner. For example, assuming the first weights determined by the terminal are 49.5 kg and 70 kg and the second weights are 50 kg and 71 kg, where for the food and drink furniture identifier chair 1 the first weight is 49.5 kg and the second weight is 50 kg, and for chair 2 the first weight is 70 kg and the second weight is 71 kg, the terminal can determine the difference between the first weight 49.5 kg and the second weight 50 kg, namely 0.5 kg, as the intake corresponding to chair 1, that is, the intake of the diner seated on chair 1, and determine the difference between the first weight 70 kg and the second weight 71 kg, namely 1 kg, as the intake corresponding to chair 2, that is, the intake of the diner seated on chair 2.
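The following sketch reproduces the chair example of steps 510 to 512: the nearest dining start and dining end weighing records are paired per chair and the weight difference is taken as that diner's intake. The record shapes are illustrative.

```python
def chair_intakes(start_weights_kg, end_weights_kg):
    """Both arguments: dict chair identifier -> weight in kg at meal start / meal end."""
    return {chair: round(end_weights_kg[chair] - start_weights_kg[chair], 3)
            for chair in start_weights_kg if chair in end_weights_kg}

# chair_intakes({"chair 1": 49.5, "chair 2": 70}, {"chair 1": 50, "chair 2": 71})
# -> {"chair 1": 0.5, "chair 2": 1.0}
```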
In step 513, the terminal generates dietary recommendation information according to the intake.
In the embodiments of the present disclosure, the terminal can generate dietary recommendation information according to the intake within a preset period, which can be 1 hour, one day or one week. The dietary recommendation information can comprise information prompting the user whether the intake meets a preset standard, and can also comprise advisory information recommending a menu to the user. For example, assuming the preset period is 1 hour, the terminal received 2 pieces of weighing information sent by the chairs within that hour, the intakes calculated from those 2 pieces of weighing information are chair 1: 0.5 kg and chair 2: 1 kg, and the preset standard in the terminal for the intake of a chair within the 1-hour preset period is 0.8 kg, the dietary recommendation information generated by the terminal can be: chair 1: increase intake (the corresponding message can be: only with a full stomach do you have the strength to lose weight); chair 2: reduce intake (the corresponding message can be: keep your mouth shut and move your legs more).
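A short sketch of step 513 under the example's assumptions follows: each chair's intake is compared with a preset standard of 0.8 kg per period to produce a suggestion.

```python
PRESET_STANDARD_KG = 0.8   # assumed preset standard for the 1-hour period in the example

def recommendation(intake_kg):
    """Turn one chair's intake into a dietary suggestion string."""
    if intake_kg < PRESET_STANDARD_KG:
        return "increase intake"
    if intake_kg > PRESET_STANDARD_KG:
        return "reduce intake"
    return "intake meets the preset standard"

# recommendation(0.5) -> "increase intake"; recommendation(1.0) -> "reduce intake"
```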
It should be noted that, in practical applications, the user can also input into the terminal personal health information (such as height, weight and age), medical history (mainly chronic metabolic diseases such as diabetes, hypertension, gout, coronary heart disease, hyperlipidemia and hyperthyroidism), allergy information, and the identifier of the chair the user habitually uses, so that the terminal can provide more accurate dietary recommendation information according to the user's health information, medical history, allergies and intake.
In step 514, the terminal displays the dietary recommendation information.
Fig. 5-13 is a schematic diagram of the terminal displaying dietary recommendation information according to an exemplary embodiment. As shown in Fig. 5-13, the terminal can display the dietary recommendation information corresponding to each food and drink furniture identifier. For example, the dietary recommendation information corresponding to the identifier chair 1 is: increase intake (only with a full stomach do you have the strength to lose weight); the dietary recommendation information corresponding to the identifier chair 2 is: reduce intake (keep your mouth shut and move your legs more).
It should be noted that, when the food and drink furniture is a dining table, the terminal also needs to generate and display dietary recommendation information according to the N sub-images contained in the catering information and the weights measured by the target sub-Weighing modules contained in the weighing information. Fig. 5-14 is a flowchart of a method by which the terminal generates dietary recommendation information, according to an exemplary embodiment. As shown in Fig. 5-14, the method comprises:
In step 5141, among the received weighing information, determine as the first weight the weight of the objects on the food and drink furniture at the current moment contained in the weighing information whose moment identifier indicates the dining start moment.
In the embodiments of the present disclosure, the terminal can obtain the dining start moment and the dining end moment nearest to the current moment and, when the dining end moment is later than the dining start moment, determine as the first weight the weight of the objects on the food and drink furniture at the current moment contained in the weighing information of that dining start moment. For example, as shown in Fig. 5-12, assuming the current moment is 13:00 and the terminal has received 4 pieces of weighing information whose moment identifiers and reception moments are as shown in Fig. 5-12, the terminal can obtain, among these 4 pieces of weighing information, the dining start moment and the dining end moment nearest to the current moment: the dining start moment in weighing information 3 and the dining end moment in weighing information 4. Since the dining end moment 12:30 is later than the dining start moment 12:00, the terminal can determine as the first weight the weight of the objects on the food and drink furniture at the current moment contained in weighing information 3 of the dining start moment 12:00. Assuming weighing information 3 is: 3: 800 g, 4: 700 g, 5: 500 g, 6: 1000 g; dining table; dining start moment, the terminal can determine 800 g, 700 g, 500 g and 1000 g as the first weight.
In step 5142, among the received weighing information, determine as the second weight the weight of the objects on the food and drink furniture at the current moment contained in the weighing information whose moment identifier indicates the dining end moment.
The terminal can obtain, among the received weighing information, the dining start moment and the dining end moment nearest to the current moment and, when the dining end moment is later than the dining start moment, determine as the second weight the weight of the objects on the food and drink furniture at the current moment contained in the weighing information of that dining end moment. For example, among the weighing information shown in Fig. 5-12, the terminal can determine as the second weight the weight of the objects on the food and drink furniture at the current moment contained in weighing information 4 of the dining end moment 12:30. Assuming weighing information 4 is: 3: 200 g, 4: 50 g, 5: 100 g, 6: 200 g; dining table; dining end moment, the terminal can determine 200 g, 50 g, 100 g and 200 g as the second weight.
In step 5143, obtain, from the first weight, the first sub-weight measured by the target sub-Weighing module corresponding to each target number.
In the embodiments of the present disclosure, according to the numbers of the regions in which the target sub-Weighing modules are located and the weights measured by the target sub-Weighing modules contained in the weighing information, the terminal can obtain the first sub-weight measured by the target sub-Weighing module corresponding to each target number. For example, assuming the target numbers are 3, 4, 5 and 6, the terminal can obtain the first sub-weight measured by the target sub-Weighing module corresponding to each target number: the first sub-weight measured by the target sub-Weighing module corresponding to target number 3 is 800 g, and those corresponding to target numbers 4 to 6 are, respectively, 700 g, 500 g and 1000 g.
In step 5144, obtain, from the second weight, the second sub-weight measured by the target sub-Weighing module corresponding to each target number.
In the embodiments of the present disclosure, according to the numbers of the regions in which the target sub-Weighing modules are located and the weights measured by the target sub-Weighing modules contained in the weighing information, the terminal can obtain the second sub-weight measured by the target sub-Weighing module corresponding to each target number. For example, assuming the target numbers are 3, 4, 5 and 6, the second sub-weight measured by the target sub-Weighing module corresponding to target number 3 is 200 g, and those corresponding to target numbers 4 to 6 are, respectively, 50 g, 100 g and 200 g.
In step 5145, determine the difference between the first sub-weight and the second sub-weight as the target intake.
The terminal can calculate, for each target number, the difference between the corresponding first sub-weight and second sub-weight, and determine that difference as the target intake corresponding to that target number. For example, assuming the target numbers are 3, 4, 5 and 6, the target intakes determined by the terminal from the first and second sub-weights corresponding to these 4 target numbers can be as shown in Table 2: the target intake corresponding to target number 3 is 600 g, and those corresponding to target numbers 4 to 6 are, respectively, 650 g, 400 g and 800 g.
Table 2

Target number   First sub-weight (g)   Second sub-weight (g)   Target intake (g)
3               800                    200                     600
4               700                    50                      650
5               500                    100                     400
6               1000                   200                     800
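The following sketch reproduces Table 2 from steps 5143 to 5145: for each target number, the first sub-weight (meal start) minus the second sub-weight (meal end) gives that dish's target intake.

```python
def target_intakes(first_weights_g, second_weights_g):
    """Both arguments: dict target number -> grams measured by its target sub-Weighing module."""
    return {n: first_weights_g[n] - second_weights_g[n]
            for n in first_weights_g if n in second_weights_g}

# target_intakes({3: 800, 4: 700, 5: 500, 6: 1000}, {3: 200, 4: 50, 5: 100, 6: 200})
# -> {3: 600, 4: 650, 5: 400, 6: 800}
```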
In step 5146, generate dietary recommendation information according to the target intake and the diet label of the sub-image corresponding to the target number.
In the embodiments of the present disclosure, the terminal can also store a nutritional component list corresponding to each diet label, and a preset standard for the intake of each nutritional component within a preset period. After the terminal determines the target intake corresponding to each target number, it can judge, according to the target intake and the diet label of the sub-image corresponding to that target number, whether the intake of each nutritional component exceeds the preset standard, and generate the dietary recommendation information. For example, assuming the diet label of the sub-image corresponding to target number 3 is "steamed bun" and the nutritional component list stored in the terminal for the diet label "steamed bun" is as shown in Table 3, it can be seen from Table 3 that every 100 g of steamed bun contains 7 g of protein, 1.1 g of fat and 47 g of carbohydrates. Since the target intake corresponding to target number 3 is 600 g, the intake of each nutritional component in Table 3 can be calculated accordingly; for example, the intake of carbohydrates is 47 x 6 = 282 g. Assuming the preset standard stored in the terminal for carbohydrate intake during one meal is 250 g, the terminal can judge that the carbohydrate intake of 282 g exceeds the preset standard of 250 g, and the dietary recommendation information generated by the terminal at this point can be: reduce the intake of carbohydrates.
Table 3

Nutritional component        Content (per 100 g)
Protein (g)                  7
Fat (g)                      1.1
Carbohydrates (g)            47
Total heat (kilocalories)    221
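The sketch below reproduces the worked example of step 5146: a per-100 g nutrient table for each diet label is scaled by the target intake and compared with a per-meal preset standard. The standards dictionary holds only the carbohydrate limit from the example and is otherwise an assumption.

```python
NUTRIENTS_PER_100G = {"steamed bun": {"protein": 7, "fat": 1.1, "carbohydrates": 47}}
PRESET_STANDARD_G = {"carbohydrates": 250}   # assumed per-meal standard from the example

def over_standard(diet_label, intake_g):
    """Return the nutrients whose intake exceeds the preset standard, with their amounts."""
    scale = intake_g / 100.0
    exceeded = {}
    for nutrient, per_100g in NUTRIENTS_PER_100G[diet_label].items():
        amount = per_100g * scale
        limit = PRESET_STANDARD_G.get(nutrient)
        if limit is not None and amount > limit:
            exceeded[nutrient] = amount   # e.g. carbohydrates: 282.0 g for 600 g of steamed bun
    return exceeded

# over_standard("steamed bun", 600) -> {"carbohydrates": 282.0}
```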
It should be noted that, in practical applications, the terminal can also judge the number of diners according to the food and drink furniture identifiers in the weighing information sent by the chairs, and generate more accurate dietary recommendation information by combining the number of diners with the intake. In addition, the user can set goals in the terminal, such as the need to lose weight, and can upload information about physical discomfort in real time, which the terminal analyzes comprehensively so as to provide suitable dishes and cooking methods in the dietary recommendation information. Furthermore, according to the user's basic information, combined with factors such as the season and weather where the user is located, the terminal can comprehensively analyze whether the user's nutrition over the recent period has been balanced, whether breakfast, lunch and dinner have been regular and whether the eating habits are good, provide dietary recommendations and eating-habit suggestions for the next day in the dietary recommendation information, and remind the user which elements should be supplemented with emphasis in the near future, together with foods containing those elements.
In step 5147, display the dietary recommendation information.
For example, Fig. 5-15 is a schematic diagram of another way in which the terminal displays dietary recommendation information, according to an exemplary embodiment. Assuming the dietary recommendation information generated by the terminal is: 1. reduce the intake of carbohydrates and fat; 2. increase the intake of protein; 3. suggested supplement: vitamin C, recommended foods: kiwi fruit, pomelo, tomato, cucumber, green pepper; the terminal can display this dietary recommendation information in the interface shown in Fig. 5-15.
It should be noted that the order of the steps of the diet monitoring method provided by the embodiments of the present disclosure can be adjusted appropriately, and steps can be added or removed as required. For example, steps 506 to 509 can be performed before step 504, or simultaneously with step 504 or step 505. Any variation of the method readily conceivable by a person skilled in the art within the technical scope disclosed by the present disclosure shall be covered by the protection scope of the present disclosure, and is therefore not described again here.
In summary, according to the diet monitoring method provided by the embodiments of the present disclosure, the camera can compare whether the dining table top image at the dining moment differs from the dining table top original image and, when they differ, obtain a difference image comprising N sub-images; the camera can then determine the diet label of each of the N sub-images, send catering information to the terminal, and send weighing instructions to the food and drink furniture at the dining start moment and the dining end moment. After the food and drink furniture receives a weighing instruction sent by the camera, it can use the Weighing module to obtain the weight of the objects on the food and drink furniture at the current moment and send weighing information to the terminal. The terminal can generate dietary recommendation information according to the catering information sent by the camera and the weighing information sent by the food and drink furniture. Therefore, the user does not need to manually input diet records into the terminal, which simplifies the operation process of diet monitoring and improves the efficiency and accuracy of diet monitoring.
Fig. 6-1 is a schematic diagram of a diet monitoring apparatus according to an exemplary embodiment. As shown in Fig. 6-1, the diet monitoring apparatus comprises:
a comparison module 601, configured to compare whether the dining table top image at the dining moment differs from the dining table top original image, the dining table top original image being a desktop image of the dining table with no articles placed on it;
a first acquisition module 602, configured to, when the dining table top image differs from the dining table top original image, obtain a difference image, the difference image comprising N sub-images, the area of any sub-image being smaller than the area of the original image, and N being an integer greater than or equal to 1;
a first determination module 603, configured to determine the diet label of each of the N sub-images;
a first sending module 604, configured to send catering information, the catering information comprising the diet labels of the N sub-images.
In summary, in the diet monitoring apparatus provided by the embodiments of the present disclosure, the camera can compare whether the dining table top image at the dining moment differs from the dining table top original image and, when they differ, obtain a difference image comprising N sub-images; the camera can then determine the diet label of each of the N sub-images and send catering information comprising those diet labels to the terminal. Therefore, the user does not need to manually input diet records into the terminal, which simplifies the operation process of diet monitoring and improves the efficiency of diet monitoring.
Fig. 6-2 is a schematic diagram of another diet monitoring apparatus according to an exemplary embodiment. As shown in Fig. 6-2, the diet monitoring apparatus comprises:
a comparison module 601, configured to compare whether the dining table top image at the dining moment differs from the dining table top original image, the dining table top original image being a desktop image of the dining table with no articles placed on it;
a first acquisition module 602, configured to, when the dining table top image differs from the dining table top original image, obtain a difference image, the difference image comprising N sub-images, the area of any sub-image being smaller than the area of the original image, and N being an integer greater than or equal to 1;
a first determination module 603, configured to determine the diet label of each of the N sub-images;
a first sending module 604, configured to send catering information, the catering information comprising the diet labels of the N sub-images;
an acquisition module 605, configured to periodically capture dining table top images, using a preset duration as the period length;
a second determination module 606, configured to, when the area of the difference image between the dining table top image of the current capture moment and that of the previous capture moment is greater than the preset area threshold, determine that the current capture moment is a dining moment;
a third determination module 607, configured to, when the area of the difference image between the dining table top image of the previous capture moment and that of the capture moment before it is not greater than the preset area threshold, determine that the current capture moment is the dining start moment;
a first judgment module 608, configured to, when the area of the difference image between the dining table top image of the current capture moment and that of the previous capture moment is not greater than the preset area threshold, judge whether the previous capture moment was a dining moment;
a fourth determination module 609, configured to, when the previous capture moment was a dining moment, determine that the current capture moment is the dining end moment;
a second sending module 610, configured to send weighing instructions at the dining start moment and the dining end moment, respectively, the weighing instruction instructing the food and drink furniture to obtain the weight of the objects on the food and drink furniture and send that weight to the terminal, the food and drink furniture comprising at least one of a dining table and a chair, the chair being located within a preset distance range around the dining table;
a second acquisition module 611, configured to obtain the preset region division of the desktop of the dining table in the dining table top original image;
a fifth determination module 612, configured to determine, according to the region division, the region in the dining table top original image in which each of the N sub-images is located;
a third acquisition module 613, configured to obtain the preset correspondence between the region division of the dining table top original image and the region numbers;
a sixth determination module 614, configured to determine, according to the correspondence, the number of the region in the dining table top original image in which each sub-image is located;
a generation module 615, configured to generate the catering information, the catering information at least comprising the number of the region in the dining table top original image in which each sub-image is located.
Optionally, the weighing instruction further comprises a moment identifier indicating whether the moment at which the weighing instruction is sent is the dining start moment or the dining end moment.
Optionally, the catering information further comprises the N sub-images and the diet labels of the N sub-images in one-to-one correspondence.
In summary, in the diet monitoring apparatus provided by the embodiments of the present disclosure, the camera can compare whether the dining table top image at the dining moment differs from the dining table top original image and, when they differ, obtain a difference image comprising N sub-images; the camera can then determine the diet label of each of the N sub-images and send catering information comprising those diet labels to the terminal. Therefore, the user does not need to manually input diet records into the terminal, which simplifies the operation process of diet monitoring and improves the efficiency of diet monitoring.
Fig. 7-1 is a block diagram of food and drink furniture according to an exemplary embodiment. The food and drink furniture comprises at least one of a dining table and a chair. As shown in Fig. 7-1,
a weighing module 701 and a communication module 702 are provided in the food and drink furniture.
The weighing module 701 is connected to the communication module 702.
Optionally, when the food and drink furniture is a dining table, the desktop of the dining table is divided into M regions, the weighing module 701 comprises M sub-weighing modules, and the M sub-weighing modules are arranged below the M regions in one-to-one correspondence; the M sub-weighing modules are each connected to the communication module 702.
For example, Fig. 7-2 is a schematic diagram of a dining table according to an exemplary embodiment. Suppose the desktop 700 of the dining table is divided into 8 regions, region 1 to region 8; the weighing module 701 then comprises 8 sub-weighing modules 7011 to 7018, arranged below region 1 to region 8 in one-to-one correspondence. The communication module 702 may be arranged below any region of the desktop 700, for example, as shown in Fig. 7-2, below the region where sub-weighing module 7017 is located, and the 8 sub-weighing modules are each connected to the communication module 702 by wires arranged below the desktop. It should be noted that the communication module 702 may also be arranged on any table leg 710 of the dining table, which is not limited in the embodiments of the present disclosure.
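A rough sketch of this layout, assuming the 8-region desktop of Fig. 7-2, shows one way the one-to-one mapping between desktop regions and sub-weighing modules could be represented; read_sensor() is a hypothetical driver call, not an interface defined by the disclosure.

class DiningTableDesktop:
    def __init__(self, region_count=8):
        # Region number -> channel of the sub-weighing module fixed below it,
        # mirroring regions 1-8 and sub-weighing modules 7011-7018.
        self.regions = {n: n - 1 for n in range(1, region_count + 1)}

    def read_region_weights(self, read_sensor):
        # Return {region number: weight} for every sub-weighing module.
        return {region: read_sensor(channel)
                for region, channel in self.regions.items()}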
Optionally, when the food and drink furniture is a chair, the weighing module is arranged below the seat of the chair.
For example, Fig. 7-3 is a schematic diagram of a chair according to an exemplary embodiment. As shown in Fig. 7-3, the weighing module 701 may be arranged below the seat 720 of the chair, and the communication module 702 may be arranged on the back of the chair, below the seat of the chair, or in a leg of the chair, which is not limited in the embodiments of the present disclosure.
In summary, with the food and drink furniture provided by the embodiments of the present disclosure, after the food and drink furniture receives the weighing instruction sent by the camera, it uses the weighing module to obtain the weight of the objects on the food and drink furniture at the current moment and sends weighing information to the terminal, the weighing information comprising the weight of the objects on the food and drink furniture at the current moment and the identifier of the food and drink furniture. The terminal can therefore monitor the user's diet according to the weighing information provided by the food and drink furniture, which improves the accuracy of diet monitoring.
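A minimal sketch of this furniture-side behaviour is given below; handle_weigh_instruction(), read_weight() and send_message() are hypothetical placeholders for the weighing module and communication module interfaces, which the disclosure does not name.

def handle_weigh_instruction(instruction, furniture_id, read_weight, send_message):
    # On a weighing instruction from the camera, read the current weight and
    # report it to the terminal together with the furniture identifier.
    weighing_info = {
        "furniture_id": furniture_id,
        "weight": read_weight(),
        # Echo the moment identifier (dining start / dining end) if present.
        "moment": instruction.get("moment"),
    }
    send_message(weighing_info)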
Fig. 8-1 is a block diagram of another diet monitoring apparatus according to an exemplary embodiment. As shown in Fig. 8-1, the diet monitoring apparatus comprises:
First receiving module 801, configured to receive catering information, the catering information comprising the diet labels of N subimages.
First display module 802, configured to display the catering information.
In summary, with the diet monitoring apparatus provided by the embodiments of the present disclosure, after the terminal receives the catering information sent by the camera, it can display the catering information, which comprises the diet labels of the N subimages. The terminal can thus present the diet record visually, which makes diet monitoring more intuitive.
Fig. 8-2 is a block diagram of another diet monitoring apparatus according to an exemplary embodiment. As shown in Fig. 8-2, the diet monitoring apparatus comprises:
First receiving module 801, configured to receive catering information, the catering information comprising the diet labels of N subimages.
First display module 802, configured to display the catering information.
Second receiving module 803, configured to receive weighing information, the weighing information comprising the weight of the objects on the food and drink furniture at the current moment and the identifier of the food and drink furniture, the weight being obtained by the food and drink furniture with a weighing module.
Second display module 804, configured to display the weighing information.
Optionally, the weighing information further comprises a moment identifier, the moment identifier indicating whether the moment at which the camera sent the weighing instruction to the food and drink furniture is the dining start moment or the dining end moment, and, as shown in Fig. 8-2, the apparatus further comprises:
First determination module 805, configured to determine, as the first weight, the weight of the objects on the food and drink furniture at the current moment comprised in the received weighing information whose moment identifier indicates the dining start moment.
Second determination module 806, configured to determine, as the second weight, the weight of the objects on the food and drink furniture at the current moment comprised in the received weighing information whose moment identifier indicates the dining end moment.
Third determination module 807, configured to determine the difference between the first weight and the second weight as the intake.
First generation module 808, configured to generate dietary recommendation information according to the intake.
Third display module 809, configured to display the dietary recommendation information.
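As an illustration of this intake calculation, the following sketch computes the intake as the difference between the start-of-meal and end-of-meal weights; the recommendation thresholds are illustrative assumptions, not values given in the disclosure.

def compute_intake(weighing_records):
    # weighing_records: list of dicts with a 'moment' field ('start'/'end')
    # and a 'weight' field, as reported by the food and drink furniture.
    first_weight = next(r["weight"] for r in weighing_records if r["moment"] == "start")
    second_weight = next(r["weight"] for r in weighing_records if r["moment"] == "end")
    return first_weight - second_weight


def dietary_recommendation(intake_grams):
    if intake_grams > 800:
        return "Intake is on the high side; consider a lighter next meal."
    if intake_grams < 200:
        return "Intake is low; consider adding a portion at the next meal."
    return "Intake is within a reasonable range."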
Optionally, the weighing information further comprises a moment identifier, the moment identifier indicating whether the moment at which the camera sent the weighing instruction to the food and drink furniture is the dining start moment or the dining end moment, and, as shown in Fig. 8-2, the apparatus further comprises:
Fourth determination module 810, configured to determine, as the first weight, the weight of the objects on the food and drink furniture at the current moment comprised in the received weighing information whose moment identifier indicates the dining start moment.
Fifth determination module 811, configured to determine, as the second weight, the weight of the objects on the food and drink furniture at the current moment comprised in the received weighing information whose moment identifier indicates the dining end moment.
First acquisition module 812, configured to obtain, from the first weight, the first sub-weight measured by the target sub-weighing module corresponding to the target number.
Second acquisition module 813, configured to obtain, from the second weight, the second sub-weight measured by the target sub-weighing module corresponding to the target number.
Sixth determination module 814, configured to determine the difference between the first sub-weight and the second sub-weight as the target intake.
Second generation module 815, configured to generate dietary recommendation information according to the target intake and the diet label of the subimage corresponding to the target number.
Fourth display module 816, configured to display the dietary recommendation information.
Optionally, the catering information further comprises:
the N subimages, the N subimages corresponding one to one with the diet labels of the N subimages;
and the first display module 802 is configured to:
display the N subimages and the diet labels of the N subimages in one-to-one correspondence.
Optionally, the food and drink furniture comprises at least one of a dining table and a chair; when the food and drink furniture is a dining table, the desktop of the dining table is divided into M regions, the weighing module comprises M sub-weighing modules, and the M sub-weighing modules are arranged below the M regions in one-to-one correspondence;
the catering information at least comprises the number of the region of the original dining table top image in which each of the N subimages is located, the original dining table top image being the image of the dining table when no articles are placed on it; the weighing information at least comprises the number of the region in which each target sub-weighing module is located and the weight measured by each target sub-weighing module, a target sub-weighing module being a sub-weighing module, among the M sub-weighing modules, that perceives a weight change;
and the second display module 804 is configured to:
match the number of the region in which each target sub-weighing module is located against the number of the region of the original dining table top image in which each of the N subimages is located, and determine, as a target number, a number that is both the number of a region in which a subimage is located in the original dining table top image and the number of a region in which a target sub-weighing module is located;
display, in association, the subimage corresponding to the target number and the weight measured by the target sub-weighing module corresponding to the target number.
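The matching step performed by the second display module 804 can be sketched as follows, assuming the catering information and the weighing information are both keyed by region number; the field names are illustrative assumptions.

def match_subimages_to_weights(catering_info, weighing_info):
    # catering_info: {region number: subimage}
    # weighing_info: {region number: weight measured by the target sub-weighing module}
    matches = []
    for region, weight in weighing_info.items():
        subimage = catering_info.get(region)
        if subimage is not None:  # the numbers agree, so this is a target number
            matches.append({"region": region, "subimage": subimage, "weight": weight})
    return matches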
In summary, with the diet monitoring apparatus provided by the embodiments of the present disclosure, after the terminal receives the catering information sent by the camera, it can display the catering information, which comprises the diet labels of the N subimages. The terminal can thus present the diet record visually, which makes diet monitoring more intuitive.
With regard to the apparatuses in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the embodiments of the corresponding method, and will not be elaborated here.
Fig. 9 is a block diagram of a diet monitoring apparatus 900 according to an exemplary embodiment. For example, the apparatus 900 may be a video camera, a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant and the like.
Referring to Fig. 9, the apparatus 900 may comprise one or more of the following components: a processing component 902, a memory 904, a power component 906, a multimedia component 908, an audio component 910, an input/output (I/O) interface 912, a sensor component 914, and a communication component 916.
The processing component 902 typically controls the overall operation of the apparatus 900, such as operations associated with display, telephone calls, data communication, camera operation and recording. The processing component 902 may comprise one or more processors 920 to execute instructions so as to perform all or part of the steps of the above methods. In addition, the processing component 902 may comprise one or more modules to facilitate interaction between the processing component 902 and the other components. For example, the processing component 902 may comprise a multimedia module to facilitate interaction between the multimedia component 908 and the processing component 902.
The memory 904 is configured to store various types of data to support the operation of the apparatus 900. Examples of such data include instructions for any application or method operated on the apparatus 900, contact data, phonebook data, messages, pictures, video, and so on. The memory 904 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk or an optical disc.
The power component 906 provides power to the various components of the apparatus 900. The power component 906 may comprise a power management system, one or more power supplies, and other components associated with generating, managing and distributing power for the apparatus 900.
The multimedia component 908 comprises a screen providing an output interface between the apparatus 900 and the user. In some embodiments, the screen may comprise a liquid crystal display (LCD) and a touch panel (TP). If the screen comprises a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel comprises one or more touch sensors to sense touches, swipes and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe action. In some embodiments, the multimedia component 908 comprises a front camera and/or a rear camera. When the apparatus 900 is in an operating mode, such as a photographing mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focusing and optical zoom capability.
The audio component 910 is configured to output and/or input audio signals. For example, the audio component 910 comprises a microphone (MIC) configured to receive external audio signals when the apparatus 900 is in an operating mode, such as a call mode, a recording mode or a speech recognition mode. The received audio signal may be further stored in the memory 904 or transmitted via the communication component 916. In some embodiments, the audio component 910 further comprises a loudspeaker for outputting audio signals.
The I/O interface 912 provides an interface between the processing component 902 and peripheral interface modules, which may be a keyboard, a click wheel, buttons and the like. These buttons may include, but are not limited to, a home button, a volume button, a start button and a lock button.
The sensor component 914 comprises one or more sensors to provide status assessments of various aspects of the apparatus 900. For example, the sensor component 914 may detect the open/closed status of the apparatus 900 and the relative positioning of components, such as the display and the keypad of the apparatus 900; the sensor component 914 may also detect a change in position of the apparatus 900 or of a component of the apparatus 900, the presence or absence of user contact with the apparatus 900, the orientation or acceleration/deceleration of the apparatus 900 and a change in temperature of the apparatus 900. The sensor component 914 may comprise a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 914 may also comprise a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 914 may also comprise an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor or a temperature sensor.
The communication component 916 is configured to facilitate wired or wireless communication between the apparatus 900 and other devices. The apparatus 900 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 916 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 916 further comprises a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology and other technologies.
In an exemplary embodiment, the apparatus 900 may be implemented by one or more application-specific integrated circuits (ASIC), digital signal processors (DSP), digital signal processing devices (DSPD), programmable logic devices (PLD), field-programmable gate arrays (FPGA), controllers, microcontrollers, microprocessors or other electronic elements, for performing the above methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions is also provided, such as the memory 904 comprising instructions, the instructions being executable by the processor 920 of the apparatus 900 to perform the above methods. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device and the like.
A non-transitory computer-readable storage medium is provided; when the instructions in the storage medium are executed by the processor of the apparatus 900, the apparatus 900 is enabled to perform a diet monitoring method, the method comprising:
comparing whether the dining table top image of the dining moment differs from the original dining table top image, the original dining table top image being the table top image of the dining table when no articles are placed on it;
when the dining table top image differs from the original dining table top image, obtaining a difference image, the difference image comprising N subimages, the area of any subimage being smaller than the area of the original image, and N being an integer greater than or equal to 1;
determining the diet label of each of the N subimages;
sending catering information, the catering information comprising the diet labels of the N subimages.
Optionally, the method further comprises:
periodically capturing dining table top images, with a preset duration as the period length;
when the area of the difference image between the dining table top image of the current capture moment and the dining table top image of the previous capture moment is greater than the preset area threshold, determining that the current capture moment is the dining moment.
Optionally, the method further comprises:
when the area of the difference image between the dining table top image of the previous capture moment and the dining table top image of the capture moment before it is not greater than the preset area threshold, determining that the current capture moment is the dining start moment.
Optionally, the method further comprises:
when the area of the difference image between the dining table top image of the current capture moment and the dining table top image of the previous capture moment is not greater than the preset area threshold, judging whether the previous capture moment is a dining moment;
when the previous capture moment is a dining moment, determining that the current capture moment is the dining end moment.
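These optional steps amount to a small decision rule over three consecutive captures. The sketch below, assuming a diff_area() helper that returns the area of the difference image between two captures, is one possible reading of that rule.

def classify_capture(prev_prev, prev, current, threshold, diff_area):
    # Classify the current capture moment as 'dining_start', 'dining',
    # 'dining_end' or 'idle' from the areas of the difference images
    # between consecutive dining table top images.
    curr_vs_prev = diff_area(current, prev)
    prev_vs_prev_prev = diff_area(prev, prev_prev)
    if curr_vs_prev > threshold:
        # The table top changed: the current moment is a dining moment,
        # and it is the dining start if the previous capture was still idle.
        return "dining_start" if prev_vs_prev_prev <= threshold else "dining"
    if prev_vs_prev_prev > threshold:
        # No further change, but the previous capture was a dining moment,
        # so the meal has just ended.
        return "dining_end"
    return "idle"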
Optionally, the method further comprises:
at the dining start moment and at the dining end moment, respectively sending a weighing instruction, the weighing instruction instructing the food and drink furniture to obtain the weight of the objects on the food and drink furniture and send the weight to the terminal, the food and drink furniture comprising at least one of a dining table and a chair, and the chair being located within a preset distance range around the dining table.
Optionally, the weighing instruction further comprises a moment identifier, the moment identifier indicating whether the moment at which the weighing instruction is sent is the dining start moment or the dining end moment.
Optionally, the method further comprises:
obtaining the preset region division of the desktop of the dining table in the original dining table top image;
determining, according to the region division, the region of the original dining table top image in which each of the N subimages is located;
obtaining the preset correspondence between the regions of the original dining table top image and the region numbers;
determining, according to the correspondence, the number of the region of the original dining table top image in which each subimage is located;
generating the catering information, the catering information at least comprising the number of the region of the original dining table top image in which each subimage is located.
Optionally, the catering information further comprises the N subimages, the N subimages corresponding one to one with the diet labels of the N subimages.
In summary, with the diet monitoring apparatus provided by the embodiments of the present disclosure, the camera compares whether the dining table top image of the dining moment differs from the original dining table top image; when the two differ, the camera obtains a difference image comprising N subimages, determines the diet label of each of the N subimages, and sends catering information comprising the diet labels of the N subimages to the terminal. The user therefore does not need to enter diet records into the terminal manually, which simplifies the diet monitoring procedure and improves the efficiency of diet monitoring.
Figure 10 is a block diagram of a diet monitoring apparatus 1000 according to an exemplary embodiment. For example, the apparatus 1000 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant and the like.
Referring to Figure 10, the apparatus 1000 may comprise one or more of the following components: a processing component 1002, a memory 1004, a power component 1006, a multimedia component 1008, an audio component 1010, an input/output (I/O) interface 1012, a sensor component 1014, and a communication component 1016.
The processing component 1002 typically controls the overall operation of the apparatus 1000, such as operations associated with display, telephone calls, data communication, camera operation and recording. The processing component 1002 may comprise one or more processors 1020 to execute instructions so as to perform all or part of the steps of the above methods. In addition, the processing component 1002 may comprise one or more modules to facilitate interaction between the processing component 1002 and the other components. For example, the processing component 1002 may comprise a multimedia module to facilitate interaction between the multimedia component 1008 and the processing component 1002.
The memory 1004 is configured to store various types of data to support the operation of the apparatus 1000. Examples of such data include instructions for any application or method operated on the apparatus 1000, contact data, phonebook data, messages, pictures, video, and so on. The memory 1004 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk or an optical disc.
The power component 1006 provides power to the various components of the apparatus 1000. The power component 1006 may comprise a power management system, one or more power supplies, and other components associated with generating, managing and distributing power for the apparatus 1000.
The multimedia component 1008 comprises a screen providing an output interface between the apparatus 1000 and the user. In some embodiments, the screen may comprise a liquid crystal display (LCD) and a touch panel (TP). If the screen comprises a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel comprises one or more touch sensors to sense touches, swipes and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe action. In some embodiments, the multimedia component 1008 comprises a front camera and/or a rear camera. When the apparatus 1000 is in an operating mode, such as a photographing mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focusing and optical zoom capability.
The audio component 1010 is configured to output and/or input audio signals. For example, the audio component 1010 comprises a microphone (MIC) configured to receive external audio signals when the apparatus 1000 is in an operating mode, such as a call mode, a recording mode or a speech recognition mode. The received audio signal may be further stored in the memory 1004 or transmitted via the communication component 1016. In some embodiments, the audio component 1010 further comprises a loudspeaker for outputting audio signals.
The I/O interface 1012 provides an interface between the processing component 1002 and peripheral interface modules, which may be a keyboard, a click wheel, buttons and the like. These buttons may include, but are not limited to, a home button, a volume button, a start button and a lock button.
The sensor component 1014 comprises one or more sensors to provide status assessments of various aspects of the apparatus 1000. For example, the sensor component 1014 may detect the open/closed status of the apparatus 1000 and the relative positioning of components, such as the display and the keypad of the apparatus 1000; the sensor component 1014 may also detect a change in position of the apparatus 1000 or of a component of the apparatus 1000, the presence or absence of user contact with the apparatus 1000, the orientation or acceleration/deceleration of the apparatus 1000 and a change in temperature of the apparatus 1000. The sensor component 1014 may comprise a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 1014 may also comprise a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 1014 may also comprise an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor or a temperature sensor.
The communication component 1016 is configured to facilitate wired or wireless communication between the apparatus 1000 and other devices. The apparatus 1000 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 1016 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 1016 further comprises a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology and other technologies.
In an exemplary embodiment, the apparatus 1000 may be implemented by one or more application-specific integrated circuits (ASIC), digital signal processors (DSP), digital signal processing devices (DSPD), programmable logic devices (PLD), field-programmable gate arrays (FPGA), controllers, microcontrollers, microprocessors or other electronic elements, for performing the above methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions is also provided, such as the memory 1004 comprising instructions, the instructions being executable by the processor 1020 of the apparatus 1000 to perform the above methods. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device and the like.
A non-transitory computer-readable storage medium is provided; when the instructions in the storage medium are executed by the processor of the apparatus 1000, the apparatus 1000 is enabled to perform a diet monitoring method, the method comprising:
receiving a weighing instruction, the weighing instruction being sent by the camera at the dining start moment or at the dining end moment;
according to the weighing instruction, using a weighing module to obtain the weight of the objects on the food and drink furniture at the current moment;
sending weighing information, the weighing information comprising the weight of the objects on the food and drink furniture at the current moment and the identifier of the food and drink furniture.
Optionally, when the food and drink furniture is a dining table, the desktop of the dining table is divided into M regions, the weighing module comprises M sub-weighing modules, and the M sub-weighing modules are arranged below the M regions in one-to-one correspondence;
the using a weighing module to obtain the weight of the objects on the food and drink furniture at the current moment according to the weighing instruction comprises:
determining, among the M sub-weighing modules, the target sub-weighing modules that perceive a weight change;
obtaining the weight measured by each target sub-weighing module.
Optionally, the method further comprises:
determining, according to the preset region division of the desktop of the dining table, the region in which each target sub-weighing module is located;
obtaining the preset correspondence between the regions of the original dining table top image and the region numbers;
determining, according to the correspondence, the number of the region in which each target sub-weighing module is located;
generating the weighing information, the weighing information at least comprising the number of the region in which each target sub-weighing module is located and the weight measured by each target sub-weighing module.
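A sketch of this per-region weighing step is given below: the dining-table side keeps only the sub-weighing modules whose readings changed (the target sub-weighing modules) and reports each one with its region number; the baseline readings and the change threshold are illustrative assumptions.

def build_weighing_info(baseline, current, furniture_id, change_threshold=5.0):
    # baseline, current: {region number: weight in grams} read from the
    # M sub-weighing modules before and at the weighing instruction.
    targets = {
        region: current[region]
        for region in current
        if abs(current[region] - baseline.get(region, 0.0)) > change_threshold
    }
    return {
        "furniture_id": furniture_id,
        "regions": sorted(targets),   # numbers of the target regions
        "weights": targets,           # weight measured per target region
    }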
Optionally, when the food and drink furniture is a chair,
the using a weighing module to obtain the weight of the objects placed on the food and drink furniture at the current moment according to the weighing instruction comprises:
according to the weighing instruction, using the weighing module arranged below the seat of the chair to obtain the weight of the human body borne by the chair at the current moment.
Optionally, the weighing instruction further comprises a moment identifier, the moment identifier indicating whether the moment at which the weighing instruction is sent is the dining start moment or the dining end moment;
the weighing information further comprises the moment identifier.
In summary, with the diet monitoring apparatus provided by the embodiments of the present disclosure, after the terminal receives the catering information sent by the camera, it can display the catering information, which comprises the diet labels of the N subimages. The terminal can thus present the diet record visually, which makes diet monitoring more intuitive.
The embodiments of the present disclosure provide a diet monitoring system, the system comprising a camera and a terminal,
the camera comprising the diet monitoring apparatus shown in Fig. 6-1 or Fig. 6-2;
the terminal comprising the diet monitoring apparatus shown in Fig. 8-1 or Fig. 8-2.
Optionally, the system further comprises the food and drink furniture shown in any one of Fig. 7-1 to Fig. 7-3.
The embodiments of the present disclosure provide another diet monitoring system, the system comprising a camera and a terminal,
the camera comprising the diet monitoring apparatus shown in Fig. 9;
the terminal comprising the diet monitoring apparatus shown in Figure 10.
Optionally, the system further comprises the food and drink furniture shown in any one of Fig. 7-1 to Fig. 7-3.
Other embodiments of the present disclosure will readily occur to those skilled in the art upon consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses or adaptations of the present disclosure that follow its general principles and include common knowledge or customary technical means in the art that are not disclosed herein. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the present disclosure being indicated by the following claims.
It should be understood that the present disclosure is not limited to the precise constructions described above and illustrated in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.

Claims (42)

1. A diet monitoring method, wherein the method comprises:
comparing whether the dining table top image of the dining moment differs from the original dining table top image, the original dining table top image being the table top image of the dining table when no articles are placed on it;
when the dining table top image differs from the original dining table top image, obtaining a difference image, the difference image comprising N subimages, the area of any of the subimages being smaller than the area of the original image, and N being an integer greater than or equal to 1;
determining the diet label of each of the N subimages;
sending catering information, the catering information comprising the diet labels of the N subimages.
2. The method according to claim 1, wherein the method further comprises:
periodically capturing dining table top images, with a preset duration as the period length;
when the area of the difference image between the dining table top image of the current capture moment and the dining table top image of the previous capture moment is greater than a preset area threshold, determining that the current capture moment is the dining moment.
3. The method according to claim 2, wherein the method further comprises:
when the area of the difference image between the dining table top image of the previous capture moment and the dining table top image of the capture moment before it is not greater than the preset area threshold, determining that the current capture moment is the dining start moment.
4. The method according to claim 2, wherein the method further comprises:
when the area of the difference image between the dining table top image of the current capture moment and the dining table top image of the previous capture moment is not greater than the preset area threshold, judging whether the previous capture moment is a dining moment;
when the previous capture moment is a dining moment, determining that the current capture moment is the dining end moment.
5. The method according to claim 4, wherein the method further comprises:
at the dining start moment and at the dining end moment, respectively sending a weighing instruction, the weighing instruction instructing the food and drink furniture to obtain the weight of the objects on the food and drink furniture and send the weight to the terminal, the food and drink furniture comprising at least one of a dining table and a chair, and the chair being located within a preset distance range around the dining table.
6. The method according to claim 5, wherein the weighing instruction further comprises a moment identifier, the moment identifier indicating whether the moment at which the weighing instruction is sent is the dining start moment or the dining end moment.
7. The method according to claim 1, wherein the method further comprises:
obtaining the preset region division of the desktop of the dining table in the original dining table top image;
determining, according to the region division, the region of the original dining table top image in which each of the N subimages is located;
obtaining the preset correspondence between the regions of the original dining table top image and the region numbers;
determining, according to the correspondence, the number of the region of the original dining table top image in which each subimage is located;
generating the catering information, the catering information at least comprising the number of the region of the original dining table top image in which each subimage is located.
8. The method according to claim 1, wherein the catering information further comprises the N subimages, the N subimages corresponding one to one with the diet labels of the N subimages.
9. A diet monitoring method, wherein the method comprises:
receiving a weighing instruction, the weighing instruction being sent by a camera at the dining start moment or at the dining end moment;
according to the weighing instruction, using a weighing module to obtain the weight of the objects on the food and drink furniture at the current moment;
sending weighing information, the weighing information comprising the weight of the objects on the food and drink furniture at the current moment and the identifier of the food and drink furniture.
10. The method according to claim 9, wherein, when the food and drink furniture is a dining table, the desktop of the dining table is divided into M regions, the weighing module comprises M sub-weighing modules, and the M sub-weighing modules are arranged below the M regions in one-to-one correspondence,
and using a weighing module to obtain the weight of the objects on the food and drink furniture at the current moment according to the weighing instruction comprises:
determining, among the M sub-weighing modules, the target sub-weighing modules that perceive a weight change;
obtaining the weight measured by each of the target sub-weighing modules.
11. The method according to claim 10, wherein the method further comprises:
determining, according to the preset region division of the desktop of the dining table, the region in which each target sub-weighing module is located;
obtaining the preset correspondence between the regions of the original dining table top image and the region numbers;
determining, according to the correspondence, the number of the region in which each target sub-weighing module is located;
generating the weighing information, the weighing information at least comprising the number of the region in which each target sub-weighing module is located and the weight measured by each target sub-weighing module.
12. The method according to claim 9, wherein, when the food and drink furniture is a chair,
using a weighing module to obtain the weight of the objects placed on the food and drink furniture at the current moment according to the weighing instruction comprises:
according to the weighing instruction, using a weighing module arranged below the seat of the chair to obtain the weight of the human body borne by the chair at the current moment.
13. The method according to claim 9, wherein the weighing instruction further comprises a moment identifier, the moment identifier indicating whether the moment at which the weighing instruction is sent is the dining start moment or the dining end moment;
the weighing information further comprises the moment identifier.
14. A diet monitoring method, wherein the method comprises:
receiving catering information, the catering information comprising the diet labels of N subimages;
displaying the catering information.
15. The method according to claim 14, wherein the method further comprises:
receiving weighing information, the weighing information comprising the weight of the objects on the food and drink furniture at the current moment and the identifier of the food and drink furniture, the weight being obtained by the food and drink furniture with a weighing module;
displaying the weighing information.
16. The method according to claim 14, wherein the catering information further comprises:
the N subimages, the N subimages corresponding one to one with the diet labels of the N subimages;
and displaying the catering information comprises:
displaying the N subimages and the diet labels of the N subimages in one-to-one correspondence.
17. The method according to claim 15, wherein the food and drink furniture comprises at least one of a dining table and a chair; when the food and drink furniture is a dining table, the desktop of the dining table is divided into M regions, the weighing module comprises M sub-weighing modules, and the M sub-weighing modules are arranged below the M regions in one-to-one correspondence,
the catering information at least comprises the number of the region of the original dining table top image in which each of the N subimages is located, the original dining table top image being the image of the dining table when no articles are placed on it; the weighing information at least comprises the number of the region in which each target sub-weighing module is located and the weight measured by each target sub-weighing module, a target sub-weighing module being a sub-weighing module, among the M sub-weighing modules, that perceives a weight change;
and displaying the weighing information comprises:
matching the number of the region in which each target sub-weighing module is located against the number of the region of the original dining table top image in which each of the N subimages is located, and determining, as a target number, a number that is both the number of a region in which a subimage is located in the original dining table top image and the number of a region in which a target sub-weighing module is located;
displaying, in association, the subimage corresponding to the target number and the weight measured by the target sub-weighing module corresponding to the target number.
18. The method according to claim 15, wherein the weighing information further comprises a moment identifier, the moment identifier indicating whether the moment at which the camera sent the weighing instruction to the food and drink furniture is the dining start moment or the dining end moment, and the method further comprises:
determining, as the first weight, the weight of the objects on the food and drink furniture at the current moment comprised in the received weighing information whose moment identifier indicates the dining start moment;
determining, as the second weight, the weight of the objects on the food and drink furniture at the current moment comprised in the received weighing information whose moment identifier indicates the dining end moment;
determining the difference between the first weight and the second weight as the intake;
generating dietary recommendation information according to the intake;
displaying the dietary recommendation information.
19. The method according to claim 17, wherein the weighing information further comprises a moment identifier, the moment identifier indicating whether the moment at which the camera sent the weighing instruction to the food and drink furniture is the dining start moment or the dining end moment, and the method further comprises:
determining, as the first weight, the weight of the objects on the food and drink furniture at the current moment comprised in the received weighing information whose moment identifier indicates the dining start moment;
determining, as the second weight, the weight of the objects on the food and drink furniture at the current moment comprised in the received weighing information whose moment identifier indicates the dining end moment;
obtaining, from the first weight, the first sub-weight measured by the target sub-weighing module corresponding to the target number;
obtaining, from the second weight, the second sub-weight measured by the target sub-weighing module corresponding to the target number;
determining the difference between the first sub-weight and the second sub-weight as the target intake;
generating dietary recommendation information according to the target intake and the diet label of the subimage corresponding to the target number;
displaying the dietary recommendation information.
20. A diet monitoring apparatus, wherein the apparatus comprises:
a comparison module, configured to compare whether the dining table top image of the dining moment differs from the original dining table top image, the original dining table top image being the table top image of the dining table when no articles are placed on it;
a first acquisition module, configured to obtain a difference image when the dining table top image differs from the original dining table top image, the difference image comprising N subimages, the area of any of the subimages being smaller than the area of the original image, and N being an integer greater than or equal to 1;
a first determination module, configured to determine the diet label of each of the N subimages;
a first sending module, configured to send catering information, the catering information comprising the diet labels of the N subimages.
21. The apparatus according to claim 20, wherein the apparatus further comprises:
a capture module, configured to periodically capture dining table top images, with a preset duration as the period length;
a second determination module, configured to determine that the current capture moment is the dining moment when the area of the difference image between the dining table top image of the current capture moment and the dining table top image of the previous capture moment is greater than a preset area threshold.
22. The apparatus according to claim 21, wherein the apparatus further comprises:
a third determination module, configured to determine that the current capture moment is the dining start moment when the area of the difference image between the dining table top image of the previous capture moment and the dining table top image of the capture moment before it is not greater than the preset area threshold.
23. The apparatus according to claim 21, wherein the apparatus further comprises:
a first judging module, configured to judge whether the previous capture moment is a dining moment when the area of the difference image between the dining table top image of the current capture moment and the dining table top image of the previous capture moment is not greater than the preset area threshold;
a fourth determination module, configured to determine that the current capture moment is the dining end moment when the previous capture moment is a dining moment.
24. The apparatus according to claim 23, wherein the apparatus further comprises:
a second sending module, configured to send a weighing instruction at the dining start moment and at the dining end moment respectively, the weighing instruction instructing the food and drink furniture to obtain the weight of the objects on the food and drink furniture and send the weight to the terminal, the food and drink furniture comprising at least one of a dining table and a chair, and the chair being located within a preset distance range around the dining table.
25. The apparatus according to claim 24, wherein the weighing instruction further comprises a moment identifier, the moment identifier indicating whether the moment at which the weighing instruction is sent is the dining start moment or the dining end moment.
26. The apparatus according to claim 20, wherein the apparatus further comprises:
a second acquisition module, configured to obtain the preset region division of the desktop of the dining table in the original dining table top image;
a fifth determination module, configured to determine, according to the region division, the region of the original dining table top image in which each of the N subimages is located;
a third acquisition module, configured to obtain the preset correspondence between the regions of the original dining table top image and the region numbers;
a sixth determination module, configured to determine, according to the correspondence, the number of the region of the original dining table top image in which each subimage is located;
a generation module, configured to generate the catering information, the catering information at least comprising the number of the region of the original dining table top image in which each subimage is located.
27. The apparatus according to claim 20, wherein the catering information further comprises the N subimages, the N subimages corresponding one to one with the diet labels of the N subimages.
28. Food and drink furniture, wherein the food and drink furniture comprises at least one of a dining table and a chair,
a weighing module and a communication module being provided in the food and drink furniture;
the weighing module being connected to the communication module.
29. The food and drink furniture according to claim 28, wherein, when the food and drink furniture is a dining table, the desktop of the dining table is divided into M regions, the weighing module comprises M sub-weighing modules, and the M sub-weighing modules are arranged below the M regions in one-to-one correspondence;
the M sub-weighing modules are each connected to the communication module.
30. The food and drink furniture according to claim 28, wherein, when the food and drink furniture is a chair, the weighing module is arranged below the seat of the chair.
31. A diet monitoring apparatus, wherein the apparatus comprises:
a first receiving module, configured to receive catering information, the catering information comprising the diet labels of N subimages;
a first display module, configured to display the catering information.
32. The apparatus according to claim 31, wherein the apparatus further comprises:
a second receiving module, configured to receive weighing information, the weighing information comprising the weight of the objects on the food and drink furniture at the current moment and the identifier of the food and drink furniture, the weight being obtained by the food and drink furniture with a weighing module;
a second display module, configured to display the weighing information.
33. The apparatus according to claim 31, wherein the catering information further comprises:
the N subimages, the N subimages corresponding one to one with the diet labels of the N subimages;
and the first display module is configured to:
display the N subimages and the diet labels of the N subimages in one-to-one correspondence.
34. The apparatus according to claim 32, wherein the food and drink furniture comprises at least one of a dining table and a chair; when the food and drink furniture is a dining table, the desktop of the dining table is divided into M regions, the weighing module comprises M sub-weighing modules, and the M sub-weighing modules are arranged below the M regions in one-to-one correspondence,
the catering information at least comprises the number of the region of the original dining table top image in which each of the N subimages is located, the original dining table top image being the image of the dining table when no articles are placed on it; the weighing information at least comprises the number of the region in which each target sub-weighing module is located and the weight measured by each target sub-weighing module, a target sub-weighing module being a sub-weighing module, among the M sub-weighing modules, that perceives a weight change;
and the second display module is configured to:
match the number of the region in which each target sub-weighing module is located against the number of the region of the original dining table top image in which each of the N subimages is located, and determine, as a target number, a number that is both the number of a region in which a subimage is located in the original dining table top image and the number of a region in which a target sub-weighing module is located;
display, in association, the subimage corresponding to the target number and the weight measured by the target sub-weighing module corresponding to the target number.
35. The apparatus according to claim 32, wherein the weighing information further comprises a moment identifier, the moment identifier indicating whether the moment at which the camera sent the weighing instruction to the food and drink furniture is the dining start moment or the dining end moment, and the apparatus further comprises:
a first determination module, configured to determine, as the first weight, the weight of the objects on the food and drink furniture at the current moment comprised in the received weighing information whose moment identifier indicates the dining start moment;
a second determination module, configured to determine, as the second weight, the weight of the objects on the food and drink furniture at the current moment comprised in the received weighing information whose moment identifier indicates the dining end moment;
a third determination module, configured to determine the difference between the first weight and the second weight as the intake;
a first generation module, configured to generate dietary recommendation information according to the intake;
a third display module, configured to display the dietary recommendation information.
36. The apparatus according to claim 34, wherein the weighing information further comprises a moment identifier, the moment identifier indicating whether the moment at which the camera sent the weighing instruction to the food and drink furniture is the dining start moment or the dining end moment, and the apparatus further comprises:
a fourth determination module, configured to determine, as the first weight, the weight of the objects on the food and drink furniture at the current moment comprised in the received weighing information whose moment identifier indicates the dining start moment;
a fifth determination module, configured to determine, as the second weight, the weight of the objects on the food and drink furniture at the current moment comprised in the received weighing information whose moment identifier indicates the dining end moment;
a first acquisition module, configured to obtain, from the first weight, the first sub-weight measured by the target sub-weighing module corresponding to the target number;
a second acquisition module, configured to obtain, from the second weight, the second sub-weight measured by the target sub-weighing module corresponding to the target number;
a sixth determination module, configured to determine the difference between the first sub-weight and the second sub-weight as the target intake;
a second generation module, configured to generate dietary recommendation information according to the target intake and the diet label of the subimage corresponding to the target number;
a fourth display module, configured to display the dietary recommendation information.
37. A diet monitoring device, characterized in that the device comprises:
a processor;
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
compare whether a dining table surface image captured at a dining moment differs from an original dining table surface image, the original dining table surface image being a surface image of the dining table with no articles placed on it;
when the dining table surface image differs from the original dining table surface image, obtain a difference image, the difference image comprising N sub-images, the area of any one of the sub-images being smaller than the area of the original image, N being an integer greater than or equal to 1;
determine a diet label for each of the N sub-images; and
send catering information, the catering information comprising the diet labels of the N sub-images.
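The comparison and difference-extraction steps recited in claim 37 could be prototyped as background subtraction against the empty-table image followed by cropping of the changed regions. The sketch below uses OpenCV 4 purely as an illustration; the library choice, BGR input format, threshold, and minimum area are assumptions and not details disclosed in the patent.

# Illustrative difference-image extraction with OpenCV 4 (assumed, not disclosed).
import cv2

def difference_subimages(table_image, original_image, min_area=500):
    # Both inputs are assumed to be BGR images of identical size.
    diff = cv2.absdiff(table_image, original_image)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 30, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    subimages = []
    for contour in contours:
        x, y, w, h = cv2.boundingRect(contour)
        if w * h >= min_area:              # ignore tiny changes (sensor noise)
            subimages.append(table_image[y:y + h, x:x + w])
    return subimages  # an empty list means the two images are not distinct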
38. A diet monitoring device, characterized in that the device comprises:
a processor;
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
receive catering information, the catering information comprising diet labels of N sub-images; and
display the catering information.
39. A diet monitoring system, characterized in that the system comprises a camera and a terminal, wherein
the camera comprises the diet monitoring device according to any one of claims 20 to 27; and
the terminal comprises the diet monitoring device according to any one of claims 31 to 36.
40. The system according to claim 39, characterized in that the system further comprises:
the catering furniture according to any one of claims 28 to 30.
41. A diet monitoring system, characterized in that the system comprises a camera and a terminal, wherein
the camera comprises the diet monitoring device according to claim 37; and
the terminal comprises the diet monitoring device according to claim 38.
42. The system according to claim 41, characterized in that the system further comprises:
the catering furniture according to any one of claims 28 to 30.
CN201510612477.XA 2015-09-23 2015-09-23 Diet monitoring method, device, system and food and drink furniture Active CN105243270B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510612477.XA CN105243270B (en) 2015-09-23 2015-09-23 Diet monitoring method, device, system and food and drink furniture

Publications (2)

Publication Number Publication Date
CN105243270A 2016-01-13
CN105243270B CN105243270B (en) 2019-05-21

Family

ID=55040918

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510612477.XA Active CN105243270B (en) 2015-09-23 2015-09-23 Diet monitoring method, device, system and food and drink furniture

Country Status (1)

Country Link
CN (1) CN105243270B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030208113A1 (en) * 2001-07-18 2003-11-06 Mault James R Closed loop glycemic index system
CN101326526A (en) * 2005-12-15 2008-12-17 皇家飞利浦电子股份有限公司 Modifying a person's eating and activity habits
CN102214269A (en) * 2010-04-06 2011-10-12 索尼公司 Information processing apparatus, information outputting method and computer program storage device
CN104720306A (en) * 2013-12-20 2015-06-24 陕西蜂翼智能科技有限公司 Dining table with weighing function
CN103942569A (en) * 2014-04-16 2014-07-23 中国计量学院 Chinese style dish recognition device based on computer vision
CN104376203A * 2014-11-07 2015-02-25 汪毅 Cloud-based physique management system and management method centered on exercise and diet intervention
CN104433395A * 2014-12-19 2015-03-25 胡明建 Design method of intelligent chair capable of monitoring health
CN104778374A * 2015-05-04 2015-07-15 哈尔滨理工大学 Automatic dietary estimation device based on image processing and recognition method

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106257478A * 2016-08-02 2016-12-28 昆明理工大学 Method and device for calculating calorie intake during a meal
CN106257478B * 2016-08-02 2019-06-11 昆明理工大学 Method and device for calculating calorie intake during a meal
CN106235614A * 2016-08-28 2016-12-21 潘昌仁 A table capable of weighing food
CN106235639A * 2016-08-28 2016-12-21 潘昌仁 Intelligent weighing table
CN106235638A * 2016-08-28 2016-12-21 潘昌仁 Intelligent table with a weighing function
CN106308047A * 2016-08-28 2017-01-11 潘昌仁 Weighing type table
CN106724024A * 2017-02-25 2017-05-31 张�浩 Multifunctional intelligent dining table with nutritional analysis function
CN107703091A * 2017-08-31 2018-02-16 维沃移动通信有限公司 Method for generating cooking advisory information and mobile terminal
CN110008829A * 2019-02-21 2019-07-12 秒针信息技术有限公司 Food adjustment method and device
CN111797756A (en) * 2020-06-30 2020-10-20 平安国际智慧城市科技股份有限公司 Video analysis method, device and medium based on artificial intelligence
CN112925257A (en) * 2021-01-25 2021-06-08 南京麦澜德医疗科技股份有限公司 Intelligent diet monitoring device and method

Also Published As

Publication number Publication date
CN105243270B (en) 2019-05-21

Similar Documents

Publication Publication Date Title
CN105243270A (en) Diet monitoring method, apparatus and system and catering furniture
CN105278378B Information prompting method and device
KR102003149B1 (en) Information generating method and device
CN203940916U Intelligent kitchen weighing device and weighing system based on wireless communication technology
US9165398B2 (en) Analysis of food items captured in digital images
CN104850432B Method and device for adjusting color
CN105741211B (en) Diet data processing method, device and equipment
CN105511277A (en) Method and device for recommending drink
CN109243579A Processing method, system, storage medium and terminal for prepared food nutrition data
CN109300526A Recommendation method and mobile terminal
CN105160568A (en) Reminding method and apparatus
US20210366033A1 (en) Information processing apparatus, information processing method, and computer program
CN112037087B Catering health and safety intelligent monitoring and management system based on big data
CN107464158A Menu generating method, device and equipment
CN110914619B Food management system
CN107438384A Intelligent cup and detection method
EP3103106A1 (en) Method of operating a control system and control system therefore
CN108876532A Information processing method and device, electronic equipment, and readable storage medium
CN105138591A (en) Method and device for controlling intelligent equipment to prepare food
JP2014123214A (en) Electronic apparatus
Lebrun et al. Management of distributed RFID surfaces: a cooking assistant for ambient computing in kitchen
JP6277582B2 (en) Electronics
JP2018092640A (en) Electronic apparatus
JP2014123215A (en) Electronic apparatus
CN106355521A (en) Diet allergy prevention system and method for applying same

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant