US20170186337A1 - Programming learning center - Google Patents
Programming learning center
- Publication number
- US20170186337A1 (U.S. patent application Ser. No. 15/457,540)
- Authority
- US
- United States
- Prior art keywords
- user
- steps
- computer program
- lesson
- ordered list
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B19/00—Teaching not covered by other main groups of this subclass
- G09B19/0053—Computers, e.g. programming
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B7/00—Electrically-operated teaching apparatus or devices working with questions and answers
- G09B7/02—Electrically-operated teaching apparatus or devices working with questions and answers of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by a student
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B5/00—Electrically-operated educational appliances
- G09B5/06—Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
- G09B5/062—Combinations of audio and printed presentations, e.g. magnetically striped cards, talking books, magnetic tapes with printed texts thereon
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B5/00—Electrically-operated educational appliances
- G09B5/08—Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations
- G09B5/12—Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations different stations being capable of presenting different information simultaneously
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Entrepreneurship & Innovation (AREA)
- Computer Hardware Design (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- User Interface Of Digital Computer (AREA)
- Electrically Operated Instructional Devices (AREA)
Abstract
A programming learning center includes a learning center workshop and a learning module generator. The learning center workshop allows a user to create a computing program by connecting programming blocks portrayed visually within the learning center workshop as building blocks. The learning module generator generates a learning module from the computing program by iterating through the computing program to generate a sequential list of steps. The learning module generator allows the user to add notations to the sequential list of steps and to reorder steps in the sequential list of steps.
Description
- The present application is a continuation application of U.S. patent application Ser. No. 13/715,417, filed Dec. 14, 2012, issued on Mar. 14, 2017 as U.S. Pat. No. 9,595,202 and entitled “PROGRAMMING LEARNING CENTER”, the entire disclosure of which application is hereby incorporated herein by reference.
- Computers are ubiquitous and used for business, education, recreation and so on. Familiarity with the principles of computer programming and experience in computer programming are useful skills. While familiarity with commonly used programming languages may be beyond the competency of many younger children, even at a young age children can learn the basic principles of computer programming.
-
- FIG. 1 is a simplified block diagram of a learning center that teaches basic programming principles in accordance with an implementation.
- FIG. 2 is a simplified block diagram of a learning center server of the learning center shown in FIG. 1 in accordance with an implementation.
- FIG. 3 is a simplified diagram illustrating operation of a graphics user interface used to program a project that runs within a learning center runtime for the learning center shown in FIG. 1 in accordance with an implementation.
- FIG. 4 is a simplified block diagram of a learning center runtime for the learning center shown in FIG. 1 in accordance with an implementation.
- FIG. 5 is a simplified flowchart that describes publication to runtime of a project designed to run within a learning center runtime for the learning center shown in FIG. 1 in accordance with an implementation.
- FIG. 6 and FIG. 7 are simplified diagrams illustrating operation of a graphics user interface used to generate lesson modules for a project designed to run within a learning center runtime in accordance with an implementation.
- FIG. 8 and FIG. 9 are simplified flowcharts that describe generation of lesson modules for a project designed to run within a learning center runtime in accordance with an implementation.
- FIG. 10 is a simplified block diagram of a lesson runner architecture that runs lesson modules within a learning center runtime in accordance with an implementation.
- FIG. 11, FIG. 12 and FIG. 13 are simplified diagrams illustrating operation of a graphics user interface used to run lesson modules within a learning center runtime in accordance with an implementation.
- FIG. 14 shows a success criteria block and a failure criteria block as displayed by a graphics user interface used to create lesson modules within a learning center runtime in accordance with an implementation.
- FIG. 15 and FIG. 16 are simplified block diagrams illustrating validation of a project within a learning center runtime in accordance with an implementation.
- FIG. 17, FIG. 18 and FIG. 19 are simplified diagrams illustrating operation of a graphics user interface providing runtime feedback to a user within a lesson running in a puzzle mode within a learning center runtime in accordance with an implementation.
- FIG. 20 is a block diagram illustrating a project runtime in accordance with an implementation.
-
FIG. 1 is a simplified block diagram of a learning center that includes a learning center server 12 that can be accessed through the Internet 11 by multiple learning center clients. Multiple learning center clients are represented in FIG. 1 by a learning center client 13 and a learning center client 14. - A learning center client may be implemented, for example, on a personal or laptop computer, a tablet computer, a smart phone, or any other computing device capable of accessing learning center server 12 through the Internet 11. An interface 15 within learning center client 13 is, for example, a web browser or specialty software that allows a user to interact with learning center server 12 through the Internet 11. Likewise, an interface 16 within learning center client 14 is, for example, a web browser or specialty software, such as an app, that allows a user to interact with learning center server 12 through the Internet 11. - For example, the learning center integrates social learning and unique game mechanics with a guided curriculum to deliver a highly engaging and rewarding experience to children of all ages. The learning center allows children to perform creative activities such as writing digital storybooks and scrapbooks, building video games, animating their favorite characters and sharing these with friends and family.
-
FIG. 2 is a simplified block diagram showing that learning center server 12 includes a learning center workshop 21, a learning center runtime 22, a learning module generator 23 and a learning module engine 24. -
Learning center workshop 21 allows a user on a learning center client to build learning center programs visually using the interface for the learning center client. Learning center runtime 22 allows a user on a learning center client to run learning center programs. -
Learning module generator 23 allows a user on a learning center client to generate learning modules from learning center programs. Learning module engine 24 allows a user on the learning center client to run learning modules and guides the user to build a learning center program. The learning module engine validates all known triggers and keeps parameters within a known range. - Table 1 below sets out an example of a language specification for the learning center.
-
TABLE 1
<scripts> ::= <script> | <script> <scripts>
<script> ::= <entry-block> <blocks>
<blocks> ::= {empty} | <exit-block> | <inline-block> <blocks>
<inline-block> ::= <container-block> | <block>
<entry-block> ::= <label>
<exit-block> ::= <label>
<block> ::= <label>
<container-block> ::= <label> <containers>
<containers> ::= <container> | <container> <containers>
<container> ::= "{" <blocks> "}"
<label> ::= {string} <params>
<params> ::= {empty} | <param> <params>
<param> ::= {string} | {number} | {boolean} | <label-block>
<label-block> ::= {string} <params>
- Table 2 below sets out an example of language blocks for the learning center.
-
TABLE 2
CONTROL
on start
when {choice:keys} pressed
when actor clicked
broadcast {events}
broadcast {events} and wait
send message {events} to {actor} with {string}
send message {events} to {actor} with {string} and wait
received value
received source
when I receive {events}
clone startup
wait {number:1} secs
forever
repeat {number:10}
create clone of {actor}
name of last cloned actor
delete this clone
forever if {boolean}
if {boolean}
if {boolean} else
wait until {boolean}
repeat until {boolean}
stop {choice:stop}
stop script
stop all
MOTION
move {number:10} steps
turn CW {angle:15} degrees
turn CCW {angle:15} degrees
point in direction {angle:90} degrees
point towards {where}
go to x: {number:0} y: {number:0}
blockMotionGoTowards
glide {number:1} secs to x: {number:0} y: {number:0}
change x by {number:10}
set x to {number:0}
change y by {number:10}
set y to {number:0}
if on edge, bounce
x position
y position
LOOKS
switch to costume {costumes}
next costume
costume #
set label to {string:Hello}
say {string:Hello} for {number:2} secs
say {string:Hello}
think {string:Hmm} for {number:2} secs
think {string:Hmm}
change {choice:effect} effect by {number:25}
set {choice:effect} effect to {number:0}
clear graphic effects
change size by {number:10}
set size to {number:100}%
size #
show
hide
go to front
go to back
go back {number:1} layers
go forward {number:1} layers
switch to background {costumes}
next background
background #
SENSING
touching {where}?
touching color {color}?
color {color} is touching {color}?
ask {string:What is your name} and wait
answer
mouse x
mouse y
mouse down?
key {choice:keys} pressed?
distance to {whereall}?
reset timer
timer
{choice:spriteprop} of {whereall}
name of actor {number}
# of actors
loudness
loud?
sensor {string:button pressed}?
{choice:datetime} of date/time
screen left
screen right
screen top
screen bottom
SOUND
play sound {sounds}
play sound {sounds} until done
stop all sounds
play drum {choice:instrument} for {number:0.2} beats
rest for {number:0.2} beats
play note {number:60} for {number:0.5} beats
set instrument to {choice:instrument}
change volume by {number:-10}
set volume to {number:100}%
volume
change tempo by {number:20}
set tempo to {number:60} bpm
tempo
OPERATOR
{number} + {number}
{number} − {number}
{number} * {number}
{number} / {number}
pick random {number:1} to {number:10}
{string} < {string}
{string} = {string}
{string} > {string}
{boolean} and {boolean}
{boolean} or {boolean}
not {boolean}
join {string:hello} {string:world}
letter {number:1} of {any:world}
length of {string:world}
{number} mod {number}
round {number}
{choice:math} of {number:10}
{choice:constants}
PEN
clear
pen up
set pen color to {color}
change pen color by {number:10}
set pen color to {number:0}
change pen shade by {number:10}
set pen shade to {number:50}
change pen size by {number:1}
set pen size to {number:1}
stamp
set font to {choice:fontstyle} {choice:fontsize} {choice:font}
draw text {string}
when drawing actor
redraw actor
set fill color to {color}
no fill
draw bezier at x1:{number:0} y1:{number:0} to x2:{number:0} y2:{number:0} with control points cx1:{number:0} cy1:{number:0} and cx2:{number:0} cy2:{number:0}
draw point at x:{number:0} y:{number:0}
draw line from x1:{number:0} y1:{number:0} to x2:{number:0} y2:{number:0}
draw rectangle at x:{number:0} y:{number:0} with width:{number:0} height:{number:0}
draw triangle with points x1:{number:0} y1:{number:0} x2:{number:0} y2:{number:0} x3:{number:0} y3:{number:0}
draw ellipse at x:{number:0} y:{number:0} with width:{number:0} height:{number:0}
draw text {string} at x:{number:0} y:{number:0}
draw rectangle with width:{number:0} height:{number:0}
draw ellipse with width:{number:0} height:{number:0}
PHYSICS
when actor collides
collided with {where}?
apply force {number:0}
apply impulse {number:0}
apply force {number:0} at {number:0} degrees
apply impulse {number:0} at {number:0} degrees
apply horizontal {number:0} and vertical {number:0} force
apply horizontal {number:0} and vertical {number:0} impulse
apply torque {number:0}
set static {boolean}
set shape to {choice:geometry}
set density to {number:10}
set friction to {number:0.5}
set restitution to {number:0.2}
set angular damping to {number:0}
set angular velocity to {number:0}
set linear damping to {number:0}
set linear velocity to {number:0} by {number:0}
density
friction
restitution
angular velocity
inertia
is awake?
x linear velocity
y linear velocity
set gravity to {number:0} by {number:10}
start physics
stop physics
VARIABLES
set {properties} of {actor} to {string:0}
property {properties} of {actor}
set {variables} to {string:0}
change {variables} by {number:1}
show variable {variables}
hide variable {variables}
LISTS
add {string:thing} to {lists}
delete {choice:lastall} of {lists}
insert {string:thing} at {choice:lastany} of {lists}
replace item {choice:lastany} of {lists} with {string:thing}
item {choice:lastany} of {lists}
length of {lists}
{lists} contains {string:thing}
- A user from a learning center client accesses
learning center workshop 21 through an interface. For example, the interface is a web browser or a dedicated app located on a computing device such as a personal computer or a tablet. When learning center workshop 21 is launched, a user can build a project, which is essentially a computer program. Learning center workshop 21 allows a user to construct a project (computer program) and save it. The computer program can be run using learning center runtime 22. - Upon entering learning center workshop 21, a user can elect to build a new computer program from scratch or open an existing computer program. - To build a computer program from scratch, the user utilizes blocks of programming instructions represented visually as building blocks within learning center workshop 21. The tools provided within learning center workshop 21 allow a user to create a scene that includes a background, main objects and actors. Learning center workshop 21 allows a user to add computer program logic to the actors and the background. The user does this by dragging and dropping visual blocks into code areas. The visual blocks snap into place to form logic sequences. -
Learning center workshop 21 saves the computer program and all its assets as the computer program is being built. For example, learning center workshop 21 saves the computer program in a persistent format so that the computer program can be loaded later. This can be done, for example, in a Javascript Object Notation (JSON) format, Extensible Markup Language (XML) or some other structured syntax. The computer program file may be stored on learning center server 12 and, in addition or instead, stored on the learning center client used by the user. -
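The save/load round trip described above can be sketched in Javascript using JSON serialization. The project shape below loosely mirrors the file in Table 3; the field names and helper functions are illustrative assumptions, not the workshop's actual API.

```javascript
// Minimal sketch of persisting a project in a JSON-based format, modeled
// loosely on the project file in Table 3. Names here are assumptions.
const project = {
  name: "Tropical Bird",
  description: "Help the tropical bird fly out to sea.",
  sprites: [
    {
      label: "Parrot",
      // Each script is a chain of blocks; "next" links one block to the next.
      scripts: [
        {
          func: "registerFlagTrigger",
          next: { func: "blockControlForever" }
        }
      ]
    }
  ]
};

// Serialize the project to a persistent format so it can be reloaded later.
function saveProject(p) {
  return JSON.stringify(p);
}

// Restore a project from its persisted form.
function loadProject(text) {
  return JSON.parse(text);
}

const restored = loadProject(saveProject(project));
console.log(restored.sprites[0].label); // → "Parrot"
```

The same round trip could equally target XML or another structured syntax, as the text notes; JSON is simply the most direct fit for a Javascript runtime.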
FIG. 3 shows a user interface 90 for learning center workshop 21. An area 90 shows a stage area where selected stages and actors are displayed. An actor area 92 shows available actors that can be selected by a user. A horizontal bar 93 includes a code tab 94 and a properties tab 95. When code tab 94 is selected, available code blocks are displayed in a code block area 97. The code blocks can be dragged and dropped by a user into work area 96 to form a computer program (project). - The flying bird project illustrated within interface 90 shown in FIG. 3 can be stored in a file using language blocks for the learning center. For example, Table 3 below sets out contents for such a file. -
TABLE 3
{“name”:“Tropical Bird”,“description”:“Help the tropical bird fly out to sea.”,“background”:{“scaleType”:“stretch”,“width”:-600,“height”:-400,“canvasWidth”:480,“canvasHeight”:320,“currentBackground”:1,“scripts”:[ ],“backgrounds”:[{“name”:“beach”,“img”:“/assets/Images/Backgrounds/Outdoor/4fcf9088692f886b16000e53.jpg”,“cx”:512,“cy”:341.5}],“sounds”:[ ],“documentation”:{“description”:“”,“inboundEvents”:[ ],“outboundEvents”:[ ],“properties”:[ ]}},
“sprites”:[{“label”:“Parrot”,“scripts”:[{“func”:“registerFlagTrigger”,“id”:6,“values”:[ ],“containers”:[ ],“next”:{“func”:“blockControlForever”,“id”:7,“values”:[ ],“containers”:[[{“func”:“blockLooksNextCostume”,“id”:8,“values”:[ ],“containers”:[ ],“next”:{“func”:“blockControlWait”,“id”:9,“values”:[{“type”:“number”,“value”:“.2”}],“containers”:[ ],“next”:{“func”:“blockMotionMove”,“id”:11,“values”:[{“type”:“number”,“value”:“6”}],“containers”:[ ],“next”:{“func”:“blockMotionBounceOnEdge”,“id”:13,“values”:[ ],“containers”:[ ]}}}}]],“x”:82,“y”:43}],
“costumes”:[{“name”:“Bird 1”,“img”:“/assets/user/50312c85692f88c95000006b.png”},{“name”:“Bird 2”,“img”:“/assets/user/50312caa692f88ba5000007f.png”,“cx”:80.5,“cy”:56.5},{“name”:“Bird 3”,“img”:“/assets/user/50312cb1692f88d550000075.png”,“cx”:80.5,“cy”:56.5},{“name”:“Bird 4”,“img”:“/assets/user/50312cb6692f88e050000078.png”,“cx”:80.5,“cy”:56.5}],
“currentCostume”:4,“sounds”:[ ],“scale”:1.1090254493487,“x”:142.77345132738003,“y”:100.08159722222,“rotation”:180,“rotateLock”:1,“isHidden”:false,“volume”:100,“locked”:false,“physics”:{“isStatic”:false,“isActive”:true,“geometry”:“circular”,“density”:1,“friction”:0.5,“restitution”:0.2},“varDefaults”:{ },“variables”:{“auto start”:true,“distance”:6},“lists”:{ },“classname”:“Parrot”,“id”:“50369e94692f885c770000c2”,
“documentation”:{“description”:“Parrot flies around and screeches.”,“inboundEvents”:[{“name”:“[registerFlagTrigger]”,“description”:“”,“visible”:true}],“outboundEvents”:[ ],“properties”:[{“name”:“auto start”,“description”:“If auto start=true, animate on start”,“visible”:true},{“name”:“distance”,“description”:“Speed of movement”,“visible”:true}]}}],
“models”:[ ],“variables”:{ },“lists”:{ },“physics”:{“enabled”:false,“gravity”:{“x”:0,“y”:10}}}
- A user can use
learning center runtime 22, shown in FIG. 1, to run a computer program built in learning center workshop 21. For example, the interface by which the user accesses learning center runtime 22 is a web browser on a computing device such as a personal computer or a tablet or is an app running on a mobile phone or tablet computer. For example, the user may iterate through refining a computer program by making modifications such as adding and removing logic, changing the code blocks used and testing the computer program to see how the computer program runs. -
FIG. 4 shows a simplified block diagram for learning center runtime 22. Learning center runtime 22 is shown to include scripts 31, objects 37, event lists 38 and a scheduler 39. User events 32, code events 33 and integrated development environment (IDE) events 34 are used to invoke learning center runtime 22. Learning center runtime 22 calls application programming interface (API) blocks. FIG. 4 also shows a script containing, for example, a context stack 301, an actor description 302 and code blocks 303. The context stack includes stack items represented by a stack item 305 having a block pointer 306 and function parameters 307. Actor description 302 includes variables 308 and properties 309. -
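The runtime structures shown in FIG. 4 can be sketched as plain Javascript objects: a script owns a context stack whose stack items record a block pointer and function parameters, and an actor description carries variables and properties. The constructors and field names below are assumptions based on the figure, not code from the implementation.

```javascript
// Sketch of the runtime structures from FIG. 4 (names are assumptions):
// an actor description holds variables and properties, and a script owns
// a context stack of { blockPointer, params } items.
function makeActor(name) {
  return { name, variables: {}, properties: { x: 0, y: 0 } };
}

function makeScript(actor, blocks) {
  return {
    actor,            // actor description (variables 308, properties 309)
    blocks,           // code blocks 303
    contextStack: []  // stack items 305: { blockPointer, params }
  };
}

// Push a stack item when entering a block, pop when leaving it.
function enterBlock(script, blockPointer, params) {
  script.contextStack.push({ blockPointer, params });
}

function leaveBlock(script) {
  return script.contextStack.pop();
}

const parrot = makeActor("Parrot");
const script = makeScript(parrot, ["forever", "next costume", "wait"]);
enterBlock(script, 0, { secs: 0.2 });
console.log(script.contextStack.length); // → 1
```

Keeping the execution position in an explicit stack item, rather than on the Javascript call stack, is what lets the scheduler suspend and resume a script between time slices.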
FIG. 20 is a block diagram illustrating implementation of a project runtime. Project data 200 can utilize resources and media 201. When a project is loaded, the project is parsed and added to object lists for each object (actor). Each object entry in the object list maintains a reference to the scripts associated with an object (actor). The project resource and media pointers are referenced by various instructions in the scripts. For example, a loaded project 210 includes an object list 202 and scripts 203. - Scripts are registered against specific event types (e.g. program start, key event, mouse event). As illustrated by
arrow 206, an external trigger event 205 results in a script 203 that has been registered in event registrations list 204 being added to a scheduler 207, which is a list of running scripts. Run loop 209 picks up a script to execute from scheduler 207. The scripts are executed in parallel by scheduler 207. Scheduler 207 determines how to select the next script (e.g. round robin, priority queue, time queue). The execution context is restored from a runtime stack 208 specific to that script. The instruction is executed as a non-blocking process. - For example, within a
project runner 209, in a block 211 a next script is fetched from the scheduler. In a block 212, execution context is restored for the fetched script. In a block 213 an instruction is run. In a block 214, context is moved to a next instruction. As illustrated by arrow 216, block 213 and block 214 are continued until there is a context switch. A context switch occurs, for example, when the script has executed a yield instruction, a time slice expires, the user interrupts execution, and so on. When there is a context switch, in a block 215, execution context is saved and context is returned to block 211. If the end of the script has not been reached, the script is retained in the scheduler 207. If the end of the script has been reached, the script is removed from runtime stack 208 and the list of running scripts within scheduler 207. - For example, for learning
center runtime 22, scripts 31 are written using Javascript. Javascript is a single-threaded environment in a web browser: a sequence of instructions executes sequentially, and must relinquish control back to the web browser before other instruction sequences can execute. As a result, multiple Javascript sequences cannot run at the same time. - For example, the learning center represents instructions as blocks so that each block represents one instruction that executes atomically, that is, without being interrupted by another block. Each block must relinquish control back to the web browser in a timely fashion.
Scheduler 39, therefore, maintains a context for each script sequence. Scheduler 39 selects a script sequence, switches to that script's context and executes a predetermined number of blocks for each turn. Scheduler 39 then selects the next script sequence and repeats until all scheduled scripts have run their turn. At this point scheduler 39 relinquishes control back to the web browser. The web browser starts up another time slice where another script sequence is executed. As a result, multiple scripts 31 can be run at the same time. -
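The time-sliced scheduling described above can be sketched as follows. Each script is modeled as a list of atomic blocks (plain functions), and the scheduler runs a fixed number of blocks per script on each turn before yielding; the names and the blocks-per-turn constant are illustrative assumptions, not the learning center's actual code.

```javascript
// Minimal sketch of the cooperative scheduler described above: each script
// is a list of atomic blocks; the scheduler runs up to BLOCKS_PER_TURN
// blocks per script, round-robin, then would yield back to the browser.
const BLOCKS_PER_TURN = 2; // assumed value for illustration

function makeScheduler() {
  const scripts = []; // list of running scripts
  return {
    add(blocks) {
      scripts.push({ blocks, pc: 0 }); // pc = index of next block to execute
    },
    // One time slice: every scheduled script runs its turn of blocks.
    runSlice() {
      for (const s of scripts) {
        for (let i = 0; i < BLOCKS_PER_TURN && s.pc < s.blocks.length; i++) {
          s.blocks[s.pc++](); // each block executes atomically
        }
      }
      // Scripts that reached their end are removed from the running list.
      for (let i = scripts.length - 1; i >= 0; i--) {
        if (scripts[i].pc >= scripts[i].blocks.length) scripts.splice(i, 1);
      }
      return scripts.length; // scripts still running
    }
  };
}

// Two scripts appear to run at the same time even though execution is
// single-threaded: their blocks are interleaved slice by slice.
const trace = [];
const sched = makeScheduler();
sched.add([() => trace.push("a1"), () => trace.push("a2"), () => trace.push("a3")]);
sched.add([() => trace.push("b1"), () => trace.push("b2")]);
sched.runSlice(); // trace: a1, a2, b1, b2
sched.runSlice(); // trace: ..., a3
console.log(trace.join(",")); // → "a1,a2,b1,b2,a3"
```

In a browser, each `runSlice` call would be driven from a timer callback (for example `setTimeout`), so control returns to the event loop between slices exactly as the text describes.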
FIG. 5 is a simplified flowchart illustrating learning center workshop 21 publishing a project (learning center program) to learning center runtime 22. As illustrated in FIG. 5, storage for projects 51 and storage for assets 52 are accessed to obtain a project and associated assets. Configuration information 53 (such as user input control mapping, display scaling and desired screen orientation) is also accessed. -
block 41, a project is loaded. In ablock 42, assets are iterated. In ablock 43, assets are fetched fromassets storage 52. In ablock 44, paths to assets are resolved and rewritten. In ablock 45, optimization is performed. For example, the optimization can include removing assets not used by a target, as shown in ablock 47. Likewise, the optimization can include recompressing and/or scaling assets for the target, as shown in ablock 48. Also, the optimization can include native code generation, as shown in ablock 49. - In a
block 46 the project is packaged based on a platform specific runtime, as illustrated by ablock 50. - Once a computer program (project) is complete, a user can choose to create a lesson module based on the computer program. For example, the user can choose a create lesson option in learning
center workshop 21 to activate learning module generator 23. -
- Learning module generator 23 includes a parser that parses through the computer program that the user built and generates a task list for the lesson module. For example, learning module generator 23 reads through the computer program, identifies all objects and identifies actions needed to recreate the computer program. Then, different kinds of steps are generated based on the results of parsing the computer program. A list of ordered steps is generated in which complex tasks are outlined and grouped together.
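The parsing pass just described can be sketched as a walk over a project's actors and script chains, emitting one ordered step per asset and per block. The project shape loosely follows Table 3; the step format and wording are illustrative assumptions, not the generator's actual output.

```javascript
// Sketch of the lesson generator's parse pass: iterate a project's sprites,
// emit a setup step for each costume, then follow each script's "next"
// chain and emit one step per block. Shapes and step text are assumptions.
function generateSteps(project) {
  const steps = [];
  for (const sprite of project.sprites || []) {
    // Non-coding setup steps (costumes, sounds, stage properties, ...).
    for (const costume of sprite.costumes || []) {
      steps.push({ kind: "resource", text: `Add costume "${costume.name}" to ${sprite.label}` });
    }
    // Coding steps: one per block in each script chain.
    for (let block of sprite.scripts || []) {
      while (block) {
        steps.push({ kind: "code", text: `Add block ${block.func} to ${sprite.label}'s code` });
        block = block.next;
      }
    }
  }
  // Number the steps to form the ordered list the author can later edit.
  return steps.map((s, i) => ({ id: i + 1, ...s }));
}

const steps = generateSteps({
  sprites: [{
    label: "Parrot",
    costumes: [{ name: "Bird 1" }],
    scripts: [{ func: "registerFlagTrigger", next: { func: "blockControlForever" } }]
  }]
});
console.log(steps.map(s => s.text));
```

A real pass would also group the blocks of one script into a single complex task, as the text notes; the flat list here keeps the sketch short.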
FIG. 6 , a drop downmenu 99 accessed by a user from the “Tropical Bird” label on the menu bar ofuser interface 90 includes a selection to “Create Lesson”. As a result, learning module generator 23 (shown inFIG. 2 ) is invoked and generates a lesson module from the computer program (shown in Table 3) for the flying bird project. -
- FIG. 7 shows, in box 100 appearing as part of interface 90, lesson steps. As discussed above, the author of the lesson module can modify the generated lesson module by changing the order of steps, adding voice over by selecting a voice over button 98, and so on. -
- FIG. 8 is a simplified flowchart showing how learning module generator 23 generates a lesson module from a computer program. In a block 61, learning module generator 23 iterates through objects in the project (computer program). In a block 62, learning module generator 23 sets up non-coding properties 67. Non-coding properties 67 include, for example, names 68, costumes 69, sounds 70, stage properties 71 and actor properties 72. -
block 63,learning module generator 23 iterates through scripts. This is done, for example, to discover dependencies between messages and actors, etc., as shown inblock 64, to sequence script steps by dependencies, as shown inblock 65, and to determine cyclic dependencies and establish a preference for definitions, as shown inblock 66. - As represented by
arrow 75, learning module generator 23 then generates a sequential list of steps 76. As illustrated by block 73, a user can add notations to sequential list of steps 76. As illustrated by block 74, a user can reorder steps within sequential list of steps 76. - Once the list of ordered steps is generated, the user can customize the lesson module. For example, the user can change the order of steps so that the reconstruction of the steps of the computer program occurs in a different order than the steps as they originally appeared in the computer program when authored.
Learning module generator 23 is used to assure that dependencies between steps are accounted for. - For example,
learning module generator 23 allows a user to add voice over in each step. The voice over is played back while the lesson module is being run within learning center runtime 22. Similarly, learning module generator 23 allows a user to add video in any step. The video is played back while the lesson module is being run within learning center runtime 22. Also, learning module generator 23 allows additional steps to be added in between the steps for the lesson module originally generated by learning module generator 23. For example, text for the lesson module can be customized. When the user has completed modifications, learning module generator 23 saves the workflow as a lesson module. -
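A minimal sketch of the kind of dependency check implied here: after the author reorders lesson steps, each step's prerequisite (compare the “attached” references in Table 4, where a block attaches to an earlier block's id) must still appear earlier in the list. The step shape below is a hypothetical simplification, not the generator's actual data model.

```javascript
// Sketch of a dependency check over an ordered list of lesson steps.
// Each step may depend on an earlier step's id (e.g. the block it attaches
// to). A reordering is valid only if every dependency precedes its step.
function isValidOrder(steps) {
  const seen = new Set();
  for (const step of steps) {
    if (step.dependsOn != null && !seen.has(step.dependsOn)) return false;
    seen.add(step.id);
  }
  return true;
}

// Hypothetical steps echoing the Tropical Bird lesson in Table 4.
const steps = [
  { id: 6, dependsOn: null }, // drag the start block
  { id: 7, dependsOn: 6 },    // forever loop attaches to start
  { id: 8, dependsOn: 7 }     // next costume goes inside the loop
];
console.log(isValidOrder(steps));                          // → true
console.log(isValidOrder([steps[1], steps[0], steps[2]])); // → false
```

A generator enforcing this could reject an invalid reordering outright, or topologically re-sequence the steps so that dependencies between steps remain accounted for.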
- FIG. 9 illustrates an alternative method for generating a lesson module from a computer program. In a block 81, a user comments on blocks of programming code. In a block 82, the user comments on program properties in an integrated development environment (IDE). Learning module generator 23 stores a list of collected comments 83 that includes additional user added annotations 84 and reordered steps 85. From list of collected comments 83, learning module generator 23 generates, as illustrated by an arrow 86, a new sequential list of steps 87 used to produce a complex project lesson module 88. - Table 4 shows an example of a computer program for a complex project lesson produced based on language blocks from the flying bird project set out in Table 3:
-
TABLE 4
{“width”:-600,“height”:-400,“bgtype”:“stretch”,“canvasWidth”:480,“canvasHeight”:320,“name”:“Tropical Bird”,“description”:“Help the tropical bird fly out to sea.”,
“resources”:[{“count”:1,“name”:“beach.jpg”,“img”:“\/assets\/Images\/Backgrounds\/Outdoor\/4fcf9088692f886b16000e53.jpg”},{“count”:1,“name”:“Bird 1.png”,“img”:“\/assets\/user\/50312c85692f88c95000006b.png”},{“count”:1,“name”:“Bird 2.png”,“img”:“\/assets\/user\/50312caa692f88ba5000007f.png”},{“count”:1,“name”:“Bird 3.png”,“img”:“\/assets\/user\/50312cb1692f88d550000075.png”},{“count”:1,“name”:“Bird 4.png”,“img”:“\/assets\/user\/50312cb6692f88e050000078.png”}],
“blocks”:[{“count”:1,“func”:“registerFlagTrigger”},{“count”:1,“func”:“blockControlForever”},{“count”:1,“func”:“blockLooksNextCostume”},{“count”:1,“func”:“blockControlWait”},{“count”:1,“func”:“blockMotionMove”},{“count”:1,“func”:“blockMotionBounceOnEdge”}],
“notes”:[{“block”:null,“spriteName”:null,“resource”:“\/assets\/Images\/Backgrounds\/Outdoor\/4fcf9088692f886b16000e53.jpg”,“resourceName”:“beach”,“attached”:null,“id”:1,“text”:“ Let's select a Background for our tropical scene<\/h2> Click Add to open the Media Library. Select the Background to add it to the stage. <\/p><\/div>\n”},
{“block”:null,“spriteName”:“Parrot”,“resource”:“\/assets\/user\/50312c85692f88c95000006b.png”,“resourceName”:“Bird 1”,“attached”:null,“properties”:{“x”:442.77345132738,“y”:99.91840277778,“rotation”:180,“rotateLock”:1,“scale”:1.1090254493487},“id”:2,“text”:“ Add a tropical bird to the Stage<\/h2> Open the Media Library to add the Parrot<\/em> to the Stage.<\/p><\/div>\n”},
{“block”:null,“spriteName”:“Parrot”,“resource”:“\/assets\/user\/50312caa692f88ba5000007f.png”,“resourceName”:“Bird 2”,“attached”:null,“id”:3,“text”:“Let's make an animation<\/h2> Let's add a couple costumes to the Parrot that will let us make an animation. Later we will make it move as well.
Add the first Costume from the Media Library to the Parrot.<\/p><\/div>\n”}, {“block”:null,“spriteName”:“Parrot”,“resource”:“\/assets\/user\/50312cb169 2f88d550000075.png”,“resourceName”:“Bird 3”,“attached”:null,“id”:4, “text”:“ Two more to go...<\/h2> Add the next Costume from the Media Library to the Parrot.<\/p><\/div>\n”},{“block”:null,“spriteName”:“Parrot”,“resource”:“\/ass ets\/user\/50312cb6692f88e050000078.png”,“resourceName”:“Bird 4”,“attached”:null,“id”:5,“text”:“ Last one...<\/h2> Add the costume \“9\” from the media library to \“Parrot\”.<\/p><\/div>\n”},{“block”:“registerFlagTrigger”,“spriteName”: “Parrot”,“resource”:null,“attached”:null,“id”:6,“text”:“ Now that we have everything set up, let's make it work!<\/h2> Drag the start<\/em> block to Parrot's code.<\/p><\/div>\n”}, {“block”:“blockControlForever”,“spriteName”:“Parrot”,“resource”:null,“attac hed”:[“registerFlagTrigger”,6],“id”:7,“text”:“ Remember how to animate?<\/h2> Just like we did in previous activities, start by adding the forever loop <\/em> block to the start<\/em> block in Parrot's code.<\/p><\/div>\n”},{“block”:“blockLooksNextCostume”,“spriteName”: “Parrot”,“resource”:null,“attached”:[“blockControlForever”,7,0],“id”:8,“text”:“ Adding animation logic to the Parrot<\/h2> Add the next costume<\/em> block into the forever loop<\/em> block in Parrot's code. This allows us to keep changing costumes to get the animation effect.<\/p><\/div>\n”}, {“block”:“blockControlWait”,“spriteName”:“Parrot”,“resource”:null,“attached ”:[“blockLooksNextCostume”,8],“id”:9,“text”:“ Adding animation logic to the Parrot<\/h2> Add the wait<\/em> block to the next costume<\/em> block in Parrot's code. Without a wait block, the Parrot flaps it's wings too fast. 
To get a better effect we need to slow it down.<\/p><\/div>\n”}, {“block”:null,“spriteName”:“Parrot”,“resource”:null,“attached”:[“blockControl Wait”,9,0,“.2”],“id”:10,“text”:“ Adjusting our animation<\/h2> Set the value of the wait<\/em> block to .2<\/em> in Parrot's code. This will allow for a better animation effect by slowing down how fast the costumes change.<\/p><\/div>\n”}, {“block”:“blockMotionMove”,“spriteName”:“Parrot”,“resource”:null, “attached”:[“blockControlWait”,9],“id”:11,“text”:“ Now that the Parrot knows how to fly, let's make it move<\/h2> Add the move<\/em> block to the wait<\/em> block in Parrot's code.<\/p><\/div>\n”},{“block”:null,“spriteName”:“Parrot”,“resource”:null, “attached”:[“blockMotionMove”,11,0,“6”],“id”:12,“text”:“ Set the speed of the Parrot<\/h2> The value of the move block determines the number of steps that the bird makes in every cycle of the loop. Set the value of the move<\/em> block to 6<\/em> in the Parrot's code.<\/p><\/div>\n”}, {“block”:“blockMotionBounceOnEdge”,“spriteName”:“Parrot”,“resource”: null,“attached”:[“blockMotionMove”,11],“id”:13,“text”:“ Last step, don't let the bird fly away<\/h2> If we were to run the program right now, the bird would just fly off the Stage. We can easily fix this by adding the bounce on edge<\/em> block to the move<\/em> block in the Parrot's code. This is the easiest way to make the Parrot turn around when it gets to the edge of the Stage.<\/p><\/div>\n”}],“ownerid”:“4fc97d5d692f883a79004c38”,“details”:“ Watch the bird fly back and forth across the Stage.”,“concepts”:“This project combines the forever loop, animation and motion to make the bird fly across the Stage. The animation is simulated by using Next Costume<\/em> in the forever loop. The Move 6Steps<\/em> block moves the bird in the direction it is pointing. If on edge, bounce<\/em> is the block that detects that the bird hits the end of the Stage and turns it around. 
Used in combination, it appears that the bird is flying across the Stage. ”} -
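The lesson definition in Table 4 is plain JSON, so it can be loaded and indexed with a short sketch. This is a hypothetical loader, not part of the patent; the field names (`resources`, `blocks`, `notes`, `id`) come from the table itself, and the sample below is a trimmed-down version of Table 4:

```python
import json

# A trimmed-down lesson definition using field names from Table 4.
lesson_json = """
{
  "name": "Tropical Bird",
  "description": "Help the tropical bird fly out to sea.",
  "canvasWidth": 480,
  "canvasHeight": 320,
  "resources": [{"count": 1, "name": "beach.jpg", "img": "/assets/Images/beach.jpg"}],
  "blocks": [{"count": 1, "func": "registerFlagTrigger"},
             {"count": 1, "func": "blockControlForever"}],
  "notes": [{"block": null, "spriteName": null, "id": 1, "text": "Select a Background"},
            {"block": "registerFlagTrigger", "spriteName": "Parrot", "id": 2,
             "text": "Drag the start block"}]
}
"""

def load_lesson(raw):
    """Parse a lesson definition and index its steps ('notes') by id."""
    lesson = json.loads(raw)
    steps = {note["id"]: note for note in lesson.get("notes", [])}
    return lesson, steps

lesson, steps = load_lesson(lesson_json)
```

Each `notes` entry carries the instruction text for one lesson step, plus the block or resource the step manipulates and, via `attached`, where that block connects.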
Learning module engine 24, shown in FIG. 2, is invoked when a user runs a lesson module. For example, a user from a learning center client utilizes learning center workshop 21 through an interface to invoke the lesson module.
- For example, the interface is a web browser on a computing device such as a personal computer or a tablet. For example, when the lesson is launched, a user chooses to run the lesson module using a browser. Then, learning module engine 24 takes over and guides the user to complete the lesson within the lesson module.
- For example, learning module engine 24 displays a lesson bar that shows the steps that the user must perform. The area of the screen that the user must work on is highlighted, and in order to proceed, the user must complete the current task. For example, learning module engine 24 provides the user with real-time help such as a "Hint/Show Me" button. Learning module engine 24 also plays any voice over or video associated with the lesson module. Learning module engine 24 also, for example, provides a user with an option to fast forward several steps in a larger task and an option to step backwards.
- For example, learning module engine 24, while the user adds logic, highlights the source and target areas of the task. If the user makes a mistake, learning module engine 24 takes the user back to a known state. Once the user has recreated the original program, the lesson is complete. The user can then use learning module generator 23 to modify the lesson module.
- For example, learning module engine 24 can also operate in other modes. For example, learning module engine 24 can include a mode where a user can open a lesson module and learning module engine 24 will animate the lesson module to a certain step. Similarly, learning module engine 24 can include a mode where a lesson module is run in slow motion continuously with voiceover. This mode can be useful, for example, when a user wants to generate a video.
-
FIG. 10 is a simplified block diagram illustrating operation of learning module engine 24. Lessons produced by learning module generator 23 are stored in lessons storage 101. A lesson loader 105 within learning module engine 24 sequentially loads assets, computer programming blocks and lesson steps respectively stored as assets used 102, blocks used 103 and lesson steps 104 within lessons storage 101. Lesson loader 105 loads lesson data and adds to data structures the media assets from assets used 102 that will be used. Media assets include, for example, images and sounds.
- From within a lesson runner 117, a get instruction block 115 fetches an instruction within the instructions loaded by lesson loader 105. The instruction may include, for example, lesson steps from lesson steps 104, assets from assets used 102 and blocks from blocks used 103. Get instruction block 115 determines the type of instruction and passes it to the appropriate lesson step handler.
- A determine type block 106 within learning module engine 24 sequentially handles instructions from lesson loader 105 and determines instruction type.
- For a plain note, the message is displayed and/or spoken. This is an informational message requiring either a timeout or user acknowledgement to continue. This is represented in FIG. 10 where, for a note 107, learning module engine 24 displays a message, as represented by a block 108.
- When a resource instruction is run, the resources that are to be used when hints are turned on are highlighted. The lesson step instructions are displayed and/or spoken with entered explanations from the lesson creator. A check is performed that the resource was placed in the correct place by checking the associated project data structures for the correct placement. This is represented in FIG. 10 where, for a resource instruction 109, learning module engine 24 displays a workshop window and highlights the change, as represented by a block 110. Learning module engine 24 also validates settings, as represented by a block 111.
- A code block instruction, when run, highlights the block to be used when hints are turned on and shows where the block should be placed on the code canvas. The lesson step instructions are displayed and/or spoken with entered explanations from the lesson creator. A check is made that the block was placed in the correct place by checking the associated project code data structures. If validation is not successful, a message appears offering some hints. For example, the hints might include such things as animating actions, highlighting a location on the display or masking a location on the display.
- Users are optionally allowed to proceed to the next step, in which case the lesson runner performs the action on behalf of the user. If validation was successful, the next lesson step is executed. This is represented in FIG. 10 where, for a code block instruction 112, learning module engine 24 displays code and highlights blocks, as represented by a block 113. Learning module engine 24 also validates programming blocks, as represented by a block 114.
- After an instruction is processed, in a block 115, a next instruction is obtained. The lesson proceeds until there are no more steps, at which point the runner can offer additional activities, or the user (lesson creator) can embed additional activities that can be done.
-
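The get-instruction/determine-type loop of FIG. 10 can be sketched as a simple dispatcher. This is an illustrative sketch only; the step shapes and handler actions here are assumptions, not the patent's actual data structures:

```python
def run_lesson(steps):
    """Sequentially fetch each lesson step and dispatch it by type,
    mirroring the get-instruction / determine-type loop of FIG. 10."""
    log = []
    for step in steps:                    # get next instruction
        kind = step["type"]               # determine instruction type
        if kind == "note":                # plain note: display and/or speak
            log.append(f"display: {step['text']}")
        elif kind == "resource":          # highlight resource, then validate
            log.append(f"highlight resource: {step['name']}")
        elif kind == "code":              # show block placement, then validate
            log.append(f"highlight block: {step['block']}")
        else:
            raise ValueError(f"unknown step type: {kind}")
    return log

trace = run_lesson([
    {"type": "note", "text": "Let's make an animation"},
    {"type": "resource", "name": "Bird 1.png"},
    {"type": "code", "block": "blockControlForever"},
])
```

Each branch corresponds to one handler in FIG. 10: note 107, resource instruction 109, and code block instruction 112; validation of placement would follow each highlight.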
FIG. 11 shows a welcome window interface 120. An area 125 provides announcements to a user. An area 121 allows the user to select an action. An area 122 allows the user to select a project (computer program) to run. In FIG. 11, a cursor 123 illustrates the user selecting an icon 124 for the tropical bird lesson. Selection of icon 124 brings up an interface for the tropical bird and activates learning module engine 24 to run a project lesson for the tropical bird, as illustrated by FIG. 12.
-
FIG. 12 shows user interface 90 with the addition of a bar 130 through which learning module engine 24 communicates with the user. As illustrated by FIG. 13, learning module engine 24 communicates to the user the next steps in the lesson and also can provide visual instruction by adjusting entities within work area 96 and code block area 97. For example, when a user selects the "Show Me" button within learning bar 130, as shown in FIG. 13, learning module engine 24 provides animation of the move block in code block area 97 being added to the wait block shown in work area 96.
- For example, the Learning Center also allows the creation and running of puzzle type lessons, with the system validating success and failure type triggers.
- That is, a puzzle is an example of a special kind of lesson that has built-in validation. For example, the puzzle has specific success criteria that the author defines, such as: "Make the robot go to the green square."
- The author of a puzzle lesson module builds the project (computer program) using learning center workshop. When building the lesson modules, the author uses two special blocks of code: a success criteria block and a failure criteria block. The author uses the blocks to define success and failure and to indicate the consequences of success and failure. The author then uses learning module generator 23 to generate a lesson module for the project.
- When a user opens the project in a lesson running mode, upon the user completing an action, learning module engine 24 will check whether the success or failure criteria are valid. Learning module engine 24 will then execute the consequences of success or failure, as appropriate. This is illustrated in FIG. 14.
-
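This runtime check can be sketched as a small function. The predicate and consequence callables below are assumptions for illustration, not the patent's actual API; the robot/green-square example comes from the text above:

```python
def check_criteria(success_test, on_success, on_failure):
    """After a user action, evaluate the author-defined success criteria
    and execute the consequence of success or failure, as in FIG. 14."""
    if success_test():
        return on_success()
    return on_failure()

# Example goal: "Make the robot go to the green square."
robot = {"x": 3, "y": 5}
green_square = {"x": 3, "y": 5}

result = check_criteria(
    success_test=lambda: robot == green_square,
    on_success=lambda: "You did it!",
    on_failure=lambda: "Try again.",
)
```

In the described system, the success and failure criteria blocks play the role of `success_test`, and their attached block stacks supply the two consequences.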
FIG. 14 shows how a lesson creator can define a success criteria block 141, a failure criteria block 142 and a failure criteria block 143 within work area 96 of interface 90 while creating a lesson or puzzle using learning center workshop 21.
-
FIG. 15 is a block diagram that illustrates a lesson module 151 that includes scripts 150 that include a success block 154 and a failure block 155. These are utilized by learning center workshop 21 to construct a project to be run by a lesson runner 160 to run the lesson.
- For example, the learning center allows a user to define activities that can be automatically validated by the learning runtime. For example, a task is presented to the student to accomplish a goal such as to write code to move a golf ball into a hole. The student creates the code. In order to check whether the code accomplishes the task, code blocks that the student has added can be checked to see that the code blocks are in the correct order. Alternatively, a trigger methodology can be used to determine whether the task was accomplished.
- For example, a trigger is assigned to the objects that a user manipulates. The trigger is based on whether a criteria placed within the computer program has been satisfied. For example, the objects are a ball and a hole. The triggers are hidden from the user. The triggers are code instructions that check for the criteria, as delineated by parameters. If the parameters are satisfied, the trigger is fired, and the process that checks the code can determine whether the user accomplished the task. For example, a geometric criteria specifies that a ball must travel a certain distance. For example, a hole trigger checks that the ball is within the bounds of the hole.
- In addition, other types of criteria can be used. For example, a time-based criteria indicates whether a task is completed within a specified amount of time. For example, did a mouse finish a maze in under 8 seconds? A code based criteria determines whether code used to accomplish a task is within predetermined parameters. For example, was a lesson completed using under 8 code blocks and without using recursion? Value-based criteria determine whether a particular value was reached. For example, was a score greater than 25? Event criteria determine whether a certain event was received. For example, was a message sent by one of the actors? A physics based criteria indicates a physical property or phenomenon occurred. For example, did a cannon ball reach a velocity of at least 25 meters per second? An external physical criteria indicates some real activity outside the program occurred. For example, did a sensor on a physical stage robot move 10 feet?
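A few of these criteria can be sketched as plain predicates. The thresholds and field shapes below are illustrative assumptions drawn from the examples above (ball-in-hole, the 8-second maze, the 8-block limit), not the system's actual trigger code:

```python
import math

def geometric_criteria(ball, hole, radius):
    """Hole trigger: fires when the ball lies within the bounds of the hole."""
    return math.hypot(ball[0] - hole[0], ball[1] - hole[1]) <= radius

def time_based_criteria(elapsed_seconds, limit=8.0):
    """Time-based criteria: was the task (e.g. the maze) finished in time?"""
    return elapsed_seconds < limit

def code_based_criteria(blocks_used, max_blocks=8, used_recursion=False):
    """Code-based criteria: completed using under max_blocks and no recursion?"""
    return blocks_used < max_blocks and not used_recursion

in_hole = geometric_criteria(ball=(10.0, 10.5), hole=(10.0, 10.0), radius=1.0)
fast_enough = time_based_criteria(6.2)
compact_code = code_based_criteria(blocks_used=9)
```

Because each criteria reduces to a boolean predicate, the hidden triggers can evaluate any mix of them and fire when their parameters are satisfied.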
-
FIG. 16 illustrates validation during a lesson runtime. For example, lesson module 151 is run within learning center workshop 21, which calls lesson runner 160 to run the lesson module. Lesson runner 160 includes properties 161 and a code canvas 162. Actions on properties 161 or code canvas 162 trigger property changes 163, assets added events 164 and block events 165. Block events 165 include, for example, block attach events, block detach events, block add events and block delete events.
- An activity monitor 177 within lesson runner 160 includes a timer module 166, a resource monitor 167 and an event monitor 168. Lesson runner 160 performs a compare function 169 with a step list 170. Step list 170 includes steps 171, resources used 175 and blocks used 176. Each of steps 171 may be an asset to add step 172, a code to add step 173 or a property to change step 174.
-
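The compare function of FIG. 16 can be sketched as matching runtime events against the expected step at the head of the step list. The event tuples below are illustrative assumptions about what the activity monitor reports, not the patent's actual event format:

```python
def compare(events, step_list):
    """Consume runtime events in order, advancing past the expected step
    at the head of the step list whenever an event matches it."""
    expected = list(step_list)
    for event in events:
        if expected and event == expected[0]:
            expected.pop(0)            # current step completed, advance
    return len(expected) == 0          # True when every step was matched

steps = [
    ("asset added", "Parrot"),
    ("block attach", "blockControlForever"),
    ("property change", "wait=0.2"),
]
events = [
    ("asset added", "Parrot"),
    ("block attach", "blockControlForever"),
    ("property change", "wait=0.2"),
]
done = compare(events, steps)
```

The three step kinds here correspond to the asset add, code add and property change steps of step list 170, and the events to the property changes, assets added events and block events the activity monitor collects.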
FIG. 17 gives an example of a window 179 that may appear over interface 90 and gives instructions for a lesson module that includes validation.
- FIG. 18 gives an example of a window 181 that appears over interface 90 when a user fails to perform a lesson properly. Blow-up section 182 of work area 96 shows the incorrect code blocks.
- FIG. 19 gives an example of a window 191 that appears over interface 90 when a user performs a lesson properly. Blow-up section 192 of work area 96 shows the correct code blocks.
- After a project author generates a lesson module within learning center server 12, the author can make the lesson module available to other users of learning center server 12. For example, other users are charged a fee for using a lesson module made available by an author, and the author receives a portion of the fee based, for example, on the number of other users that use the lesson module.
- For example, an author is reimbursed based on tracking the number of times another user views or completes a lesson authored by the author. For example, an author gets paid $2 for every 1000 lesson views inside a paid course authored by the author. Alternatively, an author can sell a lesson for a flat fee.
- The foregoing discussion discloses and describes merely exemplary methods and implementations. As will be understood by those familiar with the art, the disclosed subject matter may be embodied in other specific forms without departing from the spirit or characteristics thereof. Accordingly, the present disclosure is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
Claims (11)
1. A computer implemented method comprising:
receiving, by a computer system coupled to the Internet, an access request transmitted over the Internet from a first client computing device of a first user;
in response to the access request, causing, by the computer system, a plurality of programming blocks to be visually displayed as building blocks on a user interface of the first client computing device;
receiving, by the computer system over the Internet, a selection of programming blocks connected together by the first user via the user interface of the first computing device;
generating, by the computer system, a computer program based on the programming blocks being connected together;
generating, by the computer system and based on the computer program, an ordered list of steps for a learning module, wherein the ordered list of steps identifies actions to be taken by a second user to re-create the computer program, wherein generating the ordered list of steps for the learning module includes:
discovering dependencies within the computer program; and
ordering the list of steps based on the dependencies;
in response to input from the first user via the user interface, adding a notation to the ordered list of steps and reordering steps in the ordered list of steps;
transmitting the lesson module, by the computer system over the Internet, to a second client computing device of a second user; and
causing, by the computer system, the lesson to run on the second client computing device to guide the second user in re-creating the computer program.
2. The method of claim 1 , wherein generating the ordered list of steps for the learning module additionally includes:
giving steps that include definitions that are used in steps having cyclical dependencies priority to be placed earlier in the ordered list of steps.
3. The method of claim 1 , wherein generating the ordered list of steps for the learning module includes:
analyzing objects used by the computer program to determine actor names, actor costumes, sounds, stage properties and actor properties.
4. The method of claim 1 , additionally comprising:
marketing the learning module to a plurality of other users in communication with the computer system; and,
providing compensation to the first user that connected the programming blocks upon which the computer program is based when marketing of the learning module to one or more of the other users is successful.
5. The method of claim 1 , additionally comprising:
receiving voice input from the first user via the user interface; and
adding the voice input as a notation to one or more of the ordered list of steps, whereby the voice notation is played while the lesson module is run.
6. The method of claim 1 , wherein the plurality of programming blocks includes:
a success trigger block that displays a success message when a user of the computer program satisfies a criteria within the computer program; and
a failure trigger block that displays a failure message when a user of the computer program fails to satisfy a criteria within the computer program.
7. The method of claim 6 , wherein the criteria is selected from the group consisting of:
a geometric criteria;
a time-based criteria;
a code based criteria;
a value-based criteria;
a physics based criteria;
an external physical criteria; and
combinations thereof.
8. The method of claim 1 , wherein running the lesson module on the second client computing device includes:
displaying the ordered list of steps in the lesson module to the second user;
monitoring actions of the second user to determine if correct actions are taken; and
providing a hint to the second user in response to determining, based on monitoring the actions of the second user, that the second user has problems completing the lesson module.
9. The method of claim 8 , wherein the hint is selected from the group consisting of:
animating an action;
highlighting a location on a display;
masking a location on the display; and
combinations thereof.
10. A tangible, non-transitory computer-readable medium storing instructions that, when executed by a computer system coupled to the Internet, cause the computer system to:
receive an access request transmitted over the Internet from a first client computing device of a first user;
in response to the access request, cause a plurality of programming blocks to be visually displayed as building blocks on a user interface of the first client computing device;
receive, over the Internet, a selection of programming blocks connected together by the first user via the user interface of the first computing device;
generate a computer program based on the programming blocks being connected together;
generate, based on the computer program, an ordered list of steps for a learning module, wherein the ordered list of steps identifies actions to be taken by a second user to re-create the computer program, wherein generating the ordered list of steps for the learning module includes:
discovering dependencies within the computer program; and
ordering the list of steps based on the dependencies;
in response to input from the first user via the user interface, add a notation to the ordered list of steps and reorder steps in the ordered list of steps;
transmit the lesson module, by the computer system over the Internet, to a second client computing device of a second user; and
cause, by the computer system, the lesson to run on the second client computing device to guide the second user in re-creating the computer program.
11. A system coupled to the Internet comprising:
a processor; and
memory coupled to the processor and storing instructions that, when executed by the processor, cause the system to:
receive an access request transmitted over the Internet from a first client computing device of a first user;
in response to the access request, cause a plurality of programming blocks to be visually displayed as building blocks on a user interface of the first client computing device;
receive, over the Internet, a selection of programming blocks connected together by the first user via the user interface of the first computing device;
generate a computer program based on the programming blocks being connected together;
generate, based on the computer program, an ordered list of steps for a learning module, wherein the ordered list of steps identifies actions to be taken by a second user to re-create the computer program, wherein generating the ordered list of steps for the learning module includes:
discovering dependencies within the computer program; and
ordering the list of steps based on the dependencies;
in response to input from the first user via the user interface, add a notation to the ordered list of steps and reorder steps in the ordered list of steps;
transmit the lesson module, by the computer system over the Internet, to a second client computing device of a second user; and
cause, by the computer system, the lesson to run on the second client computing device to guide the second user in re-creating the computer program.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/457,540 US20170186337A1 (en) | 2012-12-14 | 2017-03-13 | Programming learning center |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/715,417 US9595202B2 (en) | 2012-12-14 | 2012-12-14 | Programming learning center |
US15/457,540 US20170186337A1 (en) | 2012-12-14 | 2017-03-13 | Programming learning center |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/715,417 Continuation US9595202B2 (en) | 2012-12-14 | 2012-12-14 | Programming learning center |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170186337A1 (en) | 2017-06-29 |
Family
ID=50931339
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/715,417 Active 2033-04-13 US9595202B2 (en) | 2012-12-14 | 2012-12-14 | Programming learning center |
US15/457,540 Abandoned US20170186337A1 (en) | 2012-12-14 | 2017-03-13 | Programming learning center |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/715,417 Active 2033-04-13 US9595202B2 (en) | 2012-12-14 | 2012-12-14 | Programming learning center |
Country Status (1)
Country | Link |
---|---|
US (2) | US9595202B2 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10276061B2 (en) | 2012-12-18 | 2019-04-30 | Neuron Fuel, Inc. | Integrated development environment for visual and text coding |
US10510264B2 (en) | 2013-03-21 | 2019-12-17 | Neuron Fuel, Inc. | Systems and methods for customized lesson creation and application |
US11699357B2 (en) | 2020-07-07 | 2023-07-11 | Neuron Fuel, Inc. | Collaborative learning system |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017150788A1 (en) | 2016-03-03 | 2017-09-08 | 전석주 | Electronic block system for programming education |
WO2018187029A1 (en) | 2017-04-03 | 2018-10-11 | Innovation First, Inc. | Mixed mode programming |
WO2018229797A1 (en) * | 2017-06-15 | 2018-12-20 | Grasp Io Innovations Pvt Ltd. | Interactive system for teaching sequencing and programming |
US11119735B1 (en) * | 2017-09-01 | 2021-09-14 | Modkit Inc. | Universal hybrid programming environment |
IT201800009636A1 (en) * | 2018-10-19 | 2020-04-19 | Universita' Politecnica Delle Marche | APPARATUS FOR MONITORING ASSEMBLY AND PROGRAMMING OPERATIONS. |
TWI695354B (en) * | 2018-11-21 | 2020-06-01 | 呂英璋 | Computer programming learning system |
US11610059B2 (en) | 2018-12-07 | 2023-03-21 | Interject Data System, Inc. | Systems and methods for a visual interface for grid-based programs |
GB2580332A (en) * | 2018-12-31 | 2020-07-22 | Robotify Labs Ltd | A computer program product and method for teaching computer programming |
GB2602860A (en) * | 2020-07-28 | 2022-07-20 | Robogram Co Ltd | Method for processing block coding for programme education |
KR102228085B1 (en) * | 2020-07-28 | 2021-03-12 | 임상희 | Method for processing block coding for programming education |
CN112102666A (en) * | 2020-09-25 | 2020-12-18 | 深圳市易科诺科技发展有限公司 | Programming language online education learning method |
Citations (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4395236A (en) * | 1981-10-02 | 1983-07-26 | Gotthold John P | Method of generating process instructions |
US5442759A (en) * | 1992-03-26 | 1995-08-15 | International Business Machines Corporation | Interactive online tutorial system with user assistance function for software products |
US20050114776A1 (en) * | 2003-10-16 | 2005-05-26 | Leapfrog Enterprises, Inc. | Tutorial apparatus |
US20050120333A1 (en) * | 2002-02-18 | 2005-06-02 | Katsuro Inoue | Software component importance evaluation system |
US20060053372A1 (en) * | 2004-09-08 | 2006-03-09 | Transcensus, Llc | Systems and methods for teaching a person to interact with a computer program having a graphical user interface |
US20060228689A1 (en) * | 2005-04-12 | 2006-10-12 | Rajaram Kishore K | Interactive tutorial system and method |
US20060259314A1 (en) * | 2003-09-22 | 2006-11-16 | Lilach Furman | Reading device |
US20080009275A1 (en) * | 2004-01-16 | 2008-01-10 | Werner Jon H | Location-aware fitness training device, methods, and program products that support real-time interactive communication and automated route generation |
US20080059484A1 (en) * | 2006-09-06 | 2008-03-06 | K12 Inc. | Multimedia system and method for teaching in a hybrid learning environment |
US20100041007A1 (en) * | 2008-08-13 | 2010-02-18 | Chi Wang | Method and System for Knowledge Diagnosis and Tutoring |
US20100050154A1 (en) * | 2008-08-20 | 2010-02-25 | International Business Machines Corporation | System, method and program product for guiding correction of semantic errors in code using collaboration records |
US20100099954A1 (en) * | 2008-10-22 | 2010-04-22 | Zeo, Inc. | Data-driven sleep coaching system |
US20110189645A1 (en) * | 2010-01-29 | 2011-08-04 | Daniel Leininger | System and method of knowledge assessment |
US20120040326A1 (en) * | 2010-08-12 | 2012-02-16 | Emily Larson-Rutter | Methods and systems for optimizing individualized instruction and assessment |
US8185832B2 (en) * | 2001-08-14 | 2012-05-22 | National Instruments Corporation | Graphical deployment of a program to a device which displays the program connected to the device |
US20120198419A1 (en) * | 2011-02-02 | 2012-08-02 | Neill Allan W | User input auto-completion |
US20120329030A1 (en) * | 2010-01-29 | 2012-12-27 | Dan Joseph Leininger | System and method of knowledge assessment |
US8412652B2 (en) * | 2009-03-04 | 2013-04-02 | Yahoo! Inc. | Apparatus and methods for operator training in information extraction |
US8467954B2 (en) * | 2002-08-05 | 2013-06-18 | Sony Corporation | Electronic guide system, contents server for electronic guide system, portable electronic guide device, and information processing method for electronic guide system |
US20140173563A1 (en) * | 2012-12-19 | 2014-06-19 | Microsoft Corporation | Editor visualizations |
US20140170275A1 (en) * | 2011-06-21 | 2014-06-19 | Icookit Pty Ltd. | System For Automating Cooking Steps |
US20140289142A1 (en) * | 2012-10-31 | 2014-09-25 | Stanley Shanlin Gu | Method,Apparatus and System for Evaluating A Skill Level of A Job Seeker |
US20160209908A1 (en) * | 2014-10-31 | 2016-07-21 | Texas State University | Cloud-based integrated system for developing and evaluating energy efficient software |
US9612826B2 (en) * | 2014-07-31 | 2017-04-04 | Facebook, Inc. | Attributing authorship to segments of source code |
Family Cites Families (67)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5535422A (en) | 1992-03-26 | 1996-07-09 | International Business Machines Corporation | Interactive online tutorial system for software products |
US5517663A (en) * | 1993-03-22 | 1996-05-14 | Kahn; Kenneth M. | Animated user interface for computer program creation, control and execution |
US5787422A (en) | 1996-01-11 | 1998-07-28 | Xerox Corporation | Method and apparatus for information accesss employing overlapping clusters |
US5727950A (en) | 1996-05-22 | 1998-03-17 | Netsage Corporation | Agent based instruction system and method |
US5745738A (en) | 1996-05-29 | 1998-04-28 | Microsoft Corporation | Method and engine for automating the creation of simulations for demonstrating use of software |
US6149441A (en) | 1998-11-06 | 2000-11-21 | Technology For Connecticut, Inc. | Computer-based educational system |
US6493869B1 (en) * | 1999-05-28 | 2002-12-10 | Microsoft Corporation | Inheriting code in a transformational programming system |
US7519905B2 (en) | 1999-10-12 | 2009-04-14 | Webmd Corp. | Automatic formatting and validating of text for a markup language graphical user interface |
US20010039552A1 (en) * | 2000-02-04 | 2001-11-08 | Killi Tom E. | Method of reducing the size of a file and a data processing system readable medium for performing the method |
US20020087416A1 (en) | 2000-04-24 | 2002-07-04 | Knutson Roger C. | System and method for providing learning material |
US6468085B1 (en) * | 2000-07-28 | 2002-10-22 | Assessment Technology Inc. | Scale builder and method |
US7370315B1 (en) | 2000-11-21 | 2008-05-06 | Microsoft Corporation | Visual programming environment providing synchronization between source code and graphical component objects |
IL156571A0 (en) | 2000-12-22 | 2004-01-04 | Aidentity Matrix Medical Inc | Multi-agent collaborative architecture for problem solving and tutoring |
US7917888B2 (en) | 2001-01-22 | 2011-03-29 | Symbol Technologies, Inc. | System and method for building multi-modal and multi-channel applications |
US6974328B2 (en) | 2001-06-08 | 2005-12-13 | Noyo Nordisk Pharmaceuticals, Inc. | Adaptive interactive preceptored teaching system |
JP2003021999A (en) | 2001-07-06 | 2003-01-24 | Univ Saga | System and method for creating educational material |
US20050079477A1 (en) | 2001-11-01 | 2005-04-14 | Automatic E-Learning, Llc | Interactions for electronic learning system |
AU2001298079A1 (en) | 2001-11-22 | 2003-06-10 | Liqwid Krystal India Private Limited | System and method for software learning |
US7152229B2 (en) | 2002-01-18 | 2006-12-19 | Symbol Technologies, Inc | Workflow code generator |
US20040076941A1 (en) | 2002-10-16 | 2004-04-22 | Kaplan, Inc. | Online curriculum handling system including content assembly from structured storage of reusable components |
US7603664B2 (en) | 2002-10-22 | 2009-10-13 | Sun Microsystems, Inc. | System and method for marking software code |
US20040209231A1 (en) | 2003-03-10 | 2004-10-21 | Merritt Nelson A. | Second language learning system |
US8750782B2 (en) | 2003-04-02 | 2014-06-10 | Joseph M. Scandura | Building and delivering highly adaptive and configurable tutoring systems |
US20040229194A1 (en) | 2003-05-13 | 2004-11-18 | Yang George L. | Study aid system |
US7266677B1 (en) | 2003-09-25 | 2007-09-04 | Rockwell Automation Technologies, Inc. | Application modifier based on operating environment parameters |
US20050089825A1 (en) | 2003-10-23 | 2005-04-28 | Ralph Zulferino | System for educating, study assistance and/or training of computer users |
US20050147946A1 (en) * | 2003-12-31 | 2005-07-07 | Shankar Ramamurthy | Automatic object generation and user interface identification |
US20050175970A1 (en) | 2004-02-05 | 2005-08-11 | David Dunlap | Method and system for interactive teaching and practicing of language listening and speaking skills |
US20060130007A1 (en) | 2004-12-01 | 2006-06-15 | International Business Machines Corporation | Computer method and apparatus for automating translation to a modeling language |
US20060179420A1 (en) | 2004-12-22 | 2006-08-10 | Alireza Ebrahimi | Web VPCL (web visual plan construct language: a visual system and method for learning and teaching programming and problem solving) |
US7844958B2 (en) | 2005-03-11 | 2010-11-30 | Aptana, Inc. | System and method for creating target byte code |
US20070130112A1 (en) | 2005-06-30 | 2007-06-07 | Intelligentek Corp. | Multimedia conceptual search system and associated search method |
US8230059B1 (en) | 2005-11-08 | 2012-07-24 | Hewlett-Packard Development Company, L.P. | Method of monitoring resource usage in computing environment |
US8683358B2 (en) * | 2005-12-01 | 2014-03-25 | Cypress Semiconductor Corporation | Application element group operations allowing duplication while preserving interdependent logic |
US8572560B2 (en) | 2006-01-10 | 2013-10-29 | International Business Machines Corporation | Collaborative software development systems and methods providing automated programming assistance |
US8714986B2 (en) | 2006-08-31 | 2014-05-06 | Achieve3000, Inc. | System and method for providing differentiated content based on skill level |
US20110081632A1 (en) | 2006-10-12 | 2011-04-07 | Wipro Limited | System and method for distributed agile |
US20080305460A1 (en) | 2006-10-12 | 2008-12-11 | Swati Garg | Training technique for learning agile methodology |
US20090138415A1 (en) | 2007-11-02 | 2009-05-28 | James Justin Lancaster | Automated research systems and methods for researching systems |
US20110270873A1 (en) * | 2007-02-05 | 2011-11-03 | Sriram Somanchi | E-learning authorship based on meta-tagged media specific learning objects |
US20120124559A1 (en) | 2007-08-21 | 2012-05-17 | Shankar Narayana Kondur | Performance Evaluation System |
US7957985B2 (en) | 2007-09-04 | 2011-06-07 | Dynamic Health Innovations, Llc | Personalized information discovery and presentation system |
JP4999791B2 (en) | 2008-06-30 | 2012-08-15 | キヤノン株式会社 | Information processing apparatus, control method thereof, and program |
US8266594B2 (en) | 2008-08-20 | 2012-09-11 | International Business Machines Corporation | System, method and program product for correcting semantic errors in code using peer submitted code snippets |
US8595638B2 (en) | 2008-08-28 | 2013-11-26 | Nokia Corporation | User interface, device and method for displaying special locations on a map |
US7685565B1 (en) | 2009-03-19 | 2010-03-23 | International Business Machines Corporation | Run time reconfiguration of computer instructions |
US8997023B2 (en) * | 2009-08-18 | 2015-03-31 | Honeywell Asca Inc. | Rapid manipulation of flowsheet configurations |
US20110136083A1 (en) | 2009-12-08 | 2011-06-09 | Microsoft Corporation | Instructional tool for teaching search skills |
US8239840B1 (en) | 2010-03-10 | 2012-08-07 | Google Inc. | Sensor simulation for mobile device applications |
US8836643B2 (en) | 2010-06-10 | 2014-09-16 | Qualcomm Incorporated | Auto-morphing adaptive user interface device and methods |
US9215471B2 (en) | 2010-11-12 | 2015-12-15 | Microsoft Technology Licensing, Llc | Bitstream manipulation and verification of encoded digital media data |
US8843892B2 (en) * | 2010-12-03 | 2014-09-23 | Adobe Systems Incorporated | Visual representations of code in application development environments |
US8676826B2 (en) | 2011-06-28 | 2014-03-18 | International Business Machines Corporation | Method, system and program storage device for automatic incremental learning of programming language grammar |
US8731454B2 (en) | 2011-11-21 | 2014-05-20 | Age Of Learning, Inc. | E-learning lesson delivery platform |
US20130311416A1 (en) | 2012-05-16 | 2013-11-21 | Xerox Corporation | Recommending training programs |
US9449527B2 (en) | 2012-08-22 | 2016-09-20 | Ca, Inc. | Break-fix simulator |
US20140113257A1 (en) | 2012-10-18 | 2014-04-24 | Alexey N. Spiridonov | Automated evaluation of programming code |
WO2014074643A2 (en) | 2012-11-06 | 2014-05-15 | Bottlenose, Inc. | System and method for dynamically placing and scheduling of promotional items or content based on momentum of activities of a targeted audience in a network environment |
US10510264B2 (en) | 2013-03-21 | 2019-12-17 | Neuron Fuel, Inc. | Systems and methods for customized lesson creation and application |
US9595205B2 (en) | 2012-12-18 | 2017-03-14 | Neuron Fuel, Inc. | Systems and methods for goal-based programming instruction |
WO2014137321A1 (en) | 2013-03-05 | 2014-09-12 | Mcafee, Inc. | Modification of application store output |
US8924926B1 (en) * | 2013-03-15 | 2014-12-30 | Google Inc. | Techniques for disambiguating unconnected components in a visual programming interface |
US8966438B2 (en) * | 2013-06-02 | 2015-02-24 | Mark Spencer Chamberlain | System and methods for end-users to graphically program and manage computers and devices |
US20140379602A1 (en) | 2013-06-25 | 2014-12-25 | Apollo Education Group, Inc. | Skill-driven education marketplace |
US20150044642A1 (en) * | 2013-08-12 | 2015-02-12 | Khan Academy | Methods and Systems for Learning Computer Programming |
US9753703B2 (en) * | 2014-02-04 | 2017-09-05 | Salesforce.Com, Inc. | Generating identifiers for user interface elements of a web page of a web application |
US9703681B2 (en) | 2014-05-29 | 2017-07-11 | Microsoft Technology Licensing, Llc | Performance optimization tip presentation during debugging |
2012
- 2012-12-14 US US13/715,417 patent/US9595202B2/en active Active

2017
- 2017-03-13 US US15/457,540 patent/US20170186337A1/en not_active Abandoned
Patent Citations (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4395236A (en) * | 1981-10-02 | 1983-07-26 | Gotthold John P | Method of generating process instructions |
US5442759A (en) * | 1992-03-26 | 1995-08-15 | International Business Machines Corporation | Interactive online tutorial system with user assistance function for software products |
US8397172B2 (en) * | 2001-08-14 | 2013-03-12 | National Instruments Corporation | Configuring a textual language program on a first device to invoke a graphical program on a second device |
US8185832B2 (en) * | 2001-08-14 | 2012-05-22 | National Instruments Corporation | Graphical deployment of a program to a device which displays the program connected to the device |
US20050120333A1 (en) * | 2002-02-18 | 2005-06-02 | Katsuro Inoue | Software component importance evaluation system |
US8467954B2 (en) * | 2002-08-05 | 2013-06-18 | Sony Corporation | Electronic guide system, contents server for electronic guide system, portable electronic guide device, and information processing method for electronic guide system |
US20060259314A1 (en) * | 2003-09-22 | 2006-11-16 | Lilach Furman | Reading device |
US20050114776A1 (en) * | 2003-10-16 | 2005-05-26 | Leapfrog Enterprises, Inc. | Tutorial apparatus |
US20080009275A1 (en) * | 2004-01-16 | 2008-01-10 | Werner Jon H | Location-aware fitness training device, methods, and program products that support real-time interactive communication and automated route generation |
US20060053372A1 (en) * | 2004-09-08 | 2006-03-09 | Transcensus, Llc | Systems and methods for teaching a person to interact with a computer program having a graphical user interface |
US20060228689A1 (en) * | 2005-04-12 | 2006-10-12 | Rajaram Kishore K | Interactive tutorial system and method |
US20080059484A1 (en) * | 2006-09-06 | 2008-03-06 | K12 Inc. | Multimedia system and method for teaching in a hybrid learning environment |
US20100041007A1 (en) * | 2008-08-13 | 2010-02-18 | Chi Wang | Method and System for Knowledge Diagnosis and Tutoring |
US20100050154A1 (en) * | 2008-08-20 | 2010-02-25 | International Business Machines Corporation | System, method and program product for guiding correction of semantic errors in code using collaboration records |
US20100099954A1 (en) * | 2008-10-22 | 2010-04-22 | Zeo, Inc. | Data-driven sleep coaching system |
US8412652B2 (en) * | 2009-03-04 | 2013-04-02 | Yahoo! Inc. | Apparatus and methods for operator training in information extraction |
US20120329030A1 (en) * | 2010-01-29 | 2012-12-27 | Dan Joseph Leininger | System and method of knowledge assessment |
US20110189645A1 (en) * | 2010-01-29 | 2011-08-04 | Daniel Leininger | System and method of knowledge assessment |
US20120040326A1 (en) * | 2010-08-12 | 2012-02-16 | Emily Larson-Rutter | Methods and systems for optimizing individualized instruction and assessment |
US20120198419A1 (en) * | 2011-02-02 | 2012-08-02 | Neill Allan W | User input auto-completion |
US20140170275A1 (en) * | 2011-06-21 | 2014-06-19 | Icookit Pty Ltd. | System For Automating Cooking Steps |
US20140289142A1 (en) * | 2012-10-31 | 2014-09-25 | Stanley Shanlin Gu | Method,Apparatus and System for Evaluating A Skill Level of A Job Seeker |
US20140173563A1 (en) * | 2012-12-19 | 2014-06-19 | Microsoft Corporation | Editor visualizations |
US9612826B2 (en) * | 2014-07-31 | 2017-04-04 | Facebook, Inc. | Attributing authorship to segments of source code |
US20160209908A1 (en) * | 2014-10-31 | 2016-07-21 | Texas State University | Cloud-based integrated system for developing and evaluating energy efficient software |
Non-Patent Citations (1)
Title |
---|
Öqvist, Martina; Nouri, Jalal. "Coding by Hand or on the Computer? Evaluating the Effect of Assessment Mode on Performance of Students Learning Programming." CrossMark, 2018, pp. 1-21. (Year: 2018) *
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10276061B2 (en) | 2012-12-18 | 2019-04-30 | Neuron Fuel, Inc. | Integrated development environment for visual and text coding |
US10510264B2 (en) | 2013-03-21 | 2019-12-17 | Neuron Fuel, Inc. | Systems and methods for customized lesson creation and application |
US11158202B2 (en) | 2013-03-21 | 2021-10-26 | Neuron Fuel, Inc. | Systems and methods for customized lesson creation and application |
US11699357B2 (en) | 2020-07-07 | 2023-07-11 | Neuron Fuel, Inc. | Collaborative learning system |
Also Published As
Publication number | Publication date |
---|---|
US20140170633A1 (en) | 2014-06-19 |
US9595202B2 (en) | 2017-03-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11127311B2 (en) | Systems and methods for programming instruction | |
US20170186337A1 (en) | Programming learning center | |
US11645934B2 (en) | Systems and methods for customized lesson creation and application | |
Petzold | Microsoft XNA Framework Edition: Programming for Windows Phone 7 | |
Toyama et al. | Androidenv: A reinforcement learning platform for android | |
US20050054381A1 (en) | Proactive user interface | |
Palanisamy | Hands-On Intelligent Agents with OpenAI Gym: Your guide to developing AI agents using deep reinforcement learning | |
Armoni | Computer science concepts in scratch | |
Lukosek | Learning C# by Developing Games with Unity 5. x | |
Panëels et al. | HITPROTO: a tool for the rapid prototyping of haptic interactions for haptic data visualization | |
Zammetti | Learn Corona SDK game development | |
Lamberta et al. | Foundation HTML5 Animation with JavaScript | |
Pilone et al. | Head First IPhone and IPad Development: A Learner's Guide to Creating Objective-C Applications for the IPhone and IPad | |
Richter et al. | Mastering IOS Frameworks: Beyond the Basics | |
Huntley et al. | Game Programming for Artists | |
Richter et al. | IOS Components and Frameworks: Understanding the Advanced Features of the IOS SDK | |
Shekar et al. | Swift Game Development: Learn IOS 12 Game Development Using SpriteKit, SceneKit and ARKit 2.0 | |
Krastev et al. | Controlling a 2D computer game with a Leap Motion | |
Malankar | Learning Android Game Development | |
Egges | Swift Game Programming for Absolute Beginners | |
Watkiss | Beginning Game Programming with Pygame Zero: Coding Interactive Games on Raspberry Pi Using Python | |
Hoffman | Windows Phone 7 for IPhone Developers | |
Fernandez | Corona SDK mobile game development: Beginner's guide | |
CN116966568A (en) | Picture generation method, device, computer equipment and storage medium | |
Green et al. | Interactivity Basics |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| AS | Assignment | Owner name: NEURON FUEL, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHONG, KELVIN VOON-KIT;MANDYAM, SRINIVAS A.;VEDATI, KRISHNA;REEL/FRAME:050332/0188 Effective date: 20121214 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |