US20110177482A1 - Facilitating targeted interaction in a networked learning environment

Facilitating targeted interaction in a networked learning environment

Info

Publication number
US20110177482A1
US20110177482A1 (application US13/007,166; US201113007166A)
Authority
US
United States
Prior art keywords
user
learning
student
profile
users
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/007,166
Inventor
Nitzan Katz
Satish Menon
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
The University of Phoenix, Inc.
Original Assignee
Apollo Group Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apollo Group Inc
Priority to US13/007,166 (US20110177482A1)
Assigned to APOLLO GROUP, INC. Assignment of assignors' interest (see document for details). Assignors: KATZ, NITZAN; MENON, SATISH
Publication of US20110177482A1
Priority to US13/408,914 (US9583016B2)
Assigned to THE UNIVERSITY OF PHOENIX, INC. Assignment of assignors' interest (see document for details). Assignor: APOLLO EDUCATION GROUP, INC.
Legal status: Abandoned

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B7/00: Electrically-operated teaching apparatus or devices working with questions and answers
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00: Electrically-operated educational appliances

Definitions

  • the present invention relates to learning management systems.
  • the present invention relates to platforms for individualized learning.
  • Intelligent learning systems are systems that attempt to assist students in achieving specific learning goals. To date, these systems have mainly used a computerized teaching approach that mirrors the approach taken in brick-and-mortar classrooms. Each student is presented with the same lecture, content, and assessment, regardless of learning style, intelligence, or cognitive characteristics.
  • Online courses are examples of “containers” that may employ adaptive learning technology to achieve a specific goal.
  • the adaptive learning technology used by the container is largely self-contained. That is, the adaptive learning technology employed by a container is programmed for a singular unchanging goal associated with the container.
  • an adaptive learning tool may be designed to teach a student a course on the fundamentals of calculus.
  • the designer of the tool will assume that the student possesses the foundational knowledge of mathematics required to begin the course, but the tool may provide a certain amount of “review” information as a means of calibration.
  • the tool will not take into consideration the goals of any other course in which the student may be engaged. Instead, the tool will be designed to help the student achieve a particular level of proficiency in calculus. Once that level of proficiency is obtained by the student, the tool becomes useless. While data, such as assessment scores, may be saved, the core logic of the adaptive learning tool provides no additional benefit to the student unless the student decides to re-take the course or a portion of the course.
  • the illusion of adaptivity in “adaptive learning” tools is achieved by providing a dynamic experience for the student. This experience is based on the relationship between the assessment scores of the student and the pre-programmed hierarchy included in the tool. However, existing tools do not actually “adapt” to the student. Instead, by performing in a particular way, the student merely traverses down a pre-existing path through the tool's hierarchy.
  • FIG. 1 is a block diagram illustrating a learning management platform on which an embodiment may be implemented.
  • FIG. 2 is a block diagram illustrating a skills hierarchy structure that may be used in an embodiment.
  • FIG. 3 is a block diagram illustrating a learning object in an embodiment.
  • FIG. 4 illustrates a content feedback interface in an embodiment.
  • FIG. 5 illustrates a computer system upon which an embodiment may be implemented.
  • each learning object is associated with an individual skill and content associated with that skill, and a single course (e.g., Math 101) is composed of many learning objects.
  • Learning objects are organized in a hierarchy that is based on the skills associated with the learning objects. Learning objects can be made to compete with one another for a spot in the hierarchy so that the “best” learning object can be recommended more often.
  • a learning management platform generates learning recommendations for students.
  • the learning management platform implements multiple learning models and instructional strategies to guide a student throughout that student's academic journey in a way consistent with the student's cognitive characteristics and other attributes relevant to learning.
  • the platform treats a student's journey as a life-long continuum and provides a set of powerful capabilities to serve the information necessary to support the student over time, even as the student's motivations and goals change.
  • the learning management platform provides the “right outcome” for each student for a broad-based curriculum, and ultimately for each student's life-long learning goals.
  • the learning management platform learns which objects are good remediators for other objects.
  • the student interacts with the learning management platform through a “learning space” platform, which may be a web-based application or an application being executed on one or more devices or computers associated with the student.
  • the learning space platform defines the student's experience, and provides feedback to the learning management platform.
  • the learning space platform makes a request to a learning experience engine.
  • the request includes student profile information.
  • the learning experience engine provides an individualized learning recommendation. For example, an English major may receive a recommendation to learn from a particular learning object associated with the skill “using prepositions properly”, using particular content associated with that object, such as a video or audio lecture associated with the skill.
  • user profiles are compared with one another to identify students that are similar to one another. These similarities may help the learning management platform decide the best recommendation for a student. For example, if a student is similar to another student that successfully learned a particular skill using a particular learning object, then the particular learning object may be recommended.
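  • As an illustration of this kind of peer-based matching, the following sketch (a minimal Python example; the attribute names, the cosine-similarity measure, and the toy data layout are assumptions, not the patent's implementation) recommends the learning object used by the most similar student who has already mastered the skill:

    from math import sqrt

    def similarity(profile_a, profile_b):
        """Cosine similarity over the numeric attributes the two profiles share."""
        shared = set(profile_a) & set(profile_b)
        dot = sum(profile_a[k] * profile_b[k] for k in shared)
        norm_a = sqrt(sum(v * v for v in profile_a.values()))
        norm_b = sqrt(sum(v * v for v in profile_b.values()))
        return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

    def recommend(student_profile, peers, skill):
        """Return the learning object used by the most similar peer who mastered the skill."""
        candidates = [p for p in peers if skill in p["mastered"]]
        if not candidates:
            return None
        best = max(candidates, key=lambda p: similarity(student_profile, p["attributes"]))
        return best["mastered"][skill]

    peers = [
        {"attributes": {"visual": 7, "audio": 2}, "mastered": {"gerunds": "LO-205"}},
        {"attributes": {"visual": 3, "audio": 8}, "mastered": {"gerunds": "LO-209"}},
    ]
    print(recommend({"visual": 6, "audio": 1}, peers, "gerunds"))  # -> LO-205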
  • user profiles may be used to determine which users should interact with one another in online collaborative learning sessions.
  • Recommendations may also be based on the learning context of the student.
  • environmental and/or emotional circumstances associated with the student may alter the learning recommendation. For example, a learning object that requires a student to draw pictures may not be appropriate for a student using a mobile device with a small screen while riding a train.
  • the individualized learning recommendation is based at least in part on a learning skills hierarchy.
  • the learning skills hierarchy is a hierarchical multidimensional directed graph that has, as nodes, learning objects associated with skills, assessments, and content.
  • learning recommendation refers to information, provided to a student or a device associated with a student, which provides educational direction. For example, a user that completes the course “Math 101” may be advised by a learning recommendation to begin the course “Math 102.” Learning recommendations need not be limited to course recommendations, however. Instead, skills, content, tools, and activities may be recommended to a student with the goal of furthering the education of the student.
  • An individualized learning recommendation is a learning recommendation that is based on individual attributes of the student. For example, an individualized learning recommendation may take into account the type of learner a student classifies himself as. A student may receive recommendations to watch educational videos if the student identifies himself as a “visual” learner. However, if the same student performs poorly on assessments after studying with video content, but performs well on assessments after reading materials, then the learning platform may determine that the student's belief that he is a visual learner may be incorrect. In this case, future individualized learning recommendations may not include video content.
  • Individualized learning recommendations may be based on many types of information related to the student, such as past performance, interests, major, and various demographic information. These attributes may be compared with the attributes of other students that have had similar educational needs, and an individualized learning recommendation may be based on the success of similar students. For example, a first student may be required to learn integration as part of a course in Calculus. Different students, with similar attributes as the first student, that have successfully learned integration may provide insight into which content, tools, and activities will help the first student be successful in learning integration.
  • FIG. 1 is a block diagram illustrating a learning management platform 100 , according to an embodiment of the invention.
  • Learning management platform 100 generally represents a set of one or more computer programs, and associated resources, configured to manage educational data and information about students, provide learning recommendations to students, and use information gathered from analyzing student interaction with the system to increase the effectiveness of future learning recommendations.
  • Learning management platform 100 facilitates the delivery of information based on learning theories, models, and strategies.
  • Learning management platform 100 includes logic that facilitates communication between its various components.
  • learning management platform 100 includes a learning experience engine 110 , a learning space platform 120 , a learning content manager 130 , a learning tools manager 140 , a data analysis engine 150 , a knowledge base 160 , a personal cognitive DNA manager 170 , a skills hierarchy manager 180 , and learning system modules 190 A-C. Each of these components of platform 100 shall be described in greater detail hereafter.
  • Learning experience engine 110 generally represents a decision-making engine that interacts with all other components of learning management platform 100 and uses information gathered from these components to provide the best learning recommendation possible to students that interact with learning management platform 100 .
  • learning experience engine 110 includes learning recommendation logic configured to provide individualized learning recommendations based on information gathered from other elements of the learning management platform 100 , such as knowledge base 160 and personal cognitive DNA manager 170 .
  • learning experience engine 110 makes learning recommendations that are not based merely on traversal of a predetermined path that is based only on the student's degree program or class. Rather, learning experience engine 110 takes into consideration transient and non-transient profile attributes of each student.
  • a transient profile attribute is any attribute that changes with relatively high frequency.
  • a transient attribute may, for example, change along with the environmental, physical, or emotional circumstances associated with the corresponding user.
  • the current location of a student would typically constitute a transient profile attribute.
  • a non-transient profile attribute is any attribute that changes rarely, if ever.
  • Non-transient profile attributes include, for example, the birth date or home address of a student.
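  • A minimal sketch of such a profile, assuming a simple in-memory split between the two kinds of attributes (the field names and the dataclass layout are illustrative assumptions, not the patent's data model):

    from dataclasses import dataclass, field
    from typing import Any, Dict

    @dataclass
    class StudentProfile:
        # Non-transient attributes change rarely, if ever.
        non_transient: Dict[str, Any] = field(default_factory=dict)
        # Transient attributes change with high frequency (location, mood, device).
        transient: Dict[str, Any] = field(default_factory=dict)

        def update_transient(self, **attrs):
            """Overwrite transient attributes as the student's circumstances change."""
            self.transient.update(attrs)

    profile = StudentProfile(
        non_transient={"birth_date": "1990-05-01", "home_address": "123 Main St"},
        transient={"location": "commuter train", "device": "mobile", "mood": "tired"},
    )
    profile.update_transient(location="library", mood="focused")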
  • As an example of how learning experience engine 110 may make use of profile attributes, consider a situation in which a student may want to learn a skill, such as how to use gerunds in a sentence, while traveling on an airplane. Learning experience engine 110 may ask for the expected arrival time to determine how much time the student has left on the plane. Then, taking into account how much time the student has left on the plane, attributes such as that student's learning style and habits, and the type of device that the student is using, learning experience engine 110 selects appropriate content, such as audio/video content and text content, for delivery to the student's device.
  • As more information is gathered, the decisions made by learning experience engine 110 become more accurate. Students may receive hundreds of learning recommendations over time. As students perform activities and take assessments associated with recommendations, data associated with each student is updated to reflect the types of activities that work well for each student, the strengths and weaknesses of the student, and other useful education-related attributes of the students.
  • education-related attributes refers to any attributes that relate to a student's learning history, goals or abilities. As shall be described in greater detail below, education-related attributes may include non-transient attributes, such as a student's prior classes and grades, and transient attributes such as a student's current mood.
  • learning experience engine 110 can provide individualized learning recommendations with a high degree of confidence in the expected success of each student. For example, it may become clear that a particular student performs poorly when he tries to learn skills using only audio content, even though that student has expressed a preference for audio content. In this case, learning experience engine 110 may subsequently require content other than audio content to be delivered to the user, instead of or in addition to audio content.
  • learning experience engine 110 communicates and shares information with other elements of the learning management platform 100 .
  • data analysis engine 150 may not be directly communicatively coupled to skills hierarchy manager 180 in an embodiment.
  • communication between data analysis engine 150 and skills hierarchy manager 180 may nevertheless be carried out using learning experience engine 110 as an intermediary in the communication operation.
  • other elements of the learning management platform 100 may be directly communicatively coupled to one another, and communication does not require the use of the learning experience engine 110 as an intermediary.
  • personal cognitive DNA manager 170 may directly communicate with knowledge base 160 .
  • Learning space platform 120 represents the user interface that the student sees when interacting with learning management platform 100 .
  • Learning space platform 120 also includes logic that is specific to the device on which learning space platform 120 resides.
  • Learning space platform 120 includes logic configured to interact with other elements of learning management platform 100 .
  • learning space platform 120 may receive a learning recommendation from learning experience engine 110 , and based on this learning recommendation, learning space platform 120 may request content from learning content manager 130 and tools from learning tools manager 140 .
  • the learning space platform 120 provides the experience recommended to complete the tasks a student needs to master in order to meet the student's next outcome, by facilitating the delivery of learning content using appropriate tools.
  • Learning space platform 120 resides on a client computing device, in an embodiment.
  • a client computing device includes any device capable of presenting a user with learning information, such as a personal computer, mobile computing device, set-top box, or network based appliance.
  • learning space platform 120 resides on a terminal server, web server, or any other remote location that allows a user to interact with learning space platform 120 .
  • learning space platform 120 may be a web-based interface included in learning experience engine 110 .
  • Learning space platform 120 is used to make “local” decisions about the student experience, in an embodiment.
  • learning space platform 120 may be an iPhone application that detects the location of the student or asks for feedback from the student, such as feedback related to the student's mood. The location data and mood data may then be used to determine the best learning experience. Local decisions may also be based on the screen size or other attributes of the device on which the learning space platform 120 resides.
  • learning experience engine 110 may deliver a variety of learning content to the student. After the content has been received, learning space platform 120 decides which content to display, and how to display it, taking into consideration screen size, stability of Internet connection, or local preferences set by the student. In an embodiment, the decisions discussed above may be made by learning experience engine 110 .
  • learning space platform 120 may provide detailed user and time-specific transient data to learning experience engine 110 , in an embodiment.
  • the current location of the user may be provided by learning space platform 120 to learning experience engine 110 , which stores the data and uses it as input for learning recommendation decisions.
  • the learning space platform 120 may communicate the current speed at which the user is moving, thereby allowing the learning experience engine 110 to make recommendations based on whether the user is stationary (e.g. at a desk), or travelling (e.g. in a car, bus or train).
  • learning experience engine 110 may refrain from sending tests to a user during periods in which the learning space platform 120 is providing information that indicates that the user is travelling.
  • a particular type of mobile computing device may not possess the ability to install a particular tool, such as a flash plug-in. Further, the device may currently be low on battery power, making a learning mode that requires less screen use more desirable, or even the only option.
  • the size of the computing device's screen may also be considered when making a content recommendation. For example, a collaborative tool, such as a chat session or shared whiteboard system, may require a larger screen to be effective, and thus may not be appropriate for a mobile computing environment.
  • the tool (e.g. a cognitive tutor) selected to deliver content to the user may or may not be aware of the student's cognitive DNA—it is the responsibility of the learning space platform 120 to launch the tool with the appropriate configurations for customization supported by the tool. Advanced tools or newly created tools on the platform may choose to use the information on the cognitive DNA in order to personalize the experience.
  • Learning content manager 130 stores and manages learning content.
  • Learning content includes any content that may be used to learn a skill. Examples of learning content include text, audio files such as mp3 files, video files such as QuickTime files, interactive flash movies, or any other type of multimedia content.
  • learning content manager 130 includes a content repository and a content categorization system for storing and organizing learning content.
  • the content repository stores content in non-volatile memory, such as a shared hard disk system.
  • the content categorization system provides content indexing services, along with an interface for creating and associating metadata with content stored in the content repository.
  • Metadata may be embedded within the content being described by the metadata, may be in a separate metadata file, such as an XML file, that is associated with the content being described, or may be stored in a database with an association to the content being described.
  • Learning content manager 130 also includes content delivery logic configured to manage requests for content that is stored in the content repository. For example, some content may be streamed in order to preserve bandwidth. In some cases, it makes sense to deliver all required content for a particular course at the same time, such as when the student expects to be without Internet access for an extended period of time. Thus, learning content manager 130 may be directed by learning experience engine 110 to deliver content in a particular way, depending on attributes of the student. In addition, certain content formats may not be supported by certain devices. For example, content delivery logic may choose or even change the format of the content being delivered if the device requesting the content does not support a particular format, such as the flash format.
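  • As a hedged sketch of the delivery-mode decision described above (the two mode names and the offline flag are assumptions, not the content delivery logic's actual interface):

    def choose_delivery_mode(expects_offline_period, supports_streaming):
        """Stream to preserve bandwidth unless the student will be offline or cannot stream."""
        if expects_offline_period or not supports_streaming:
            return "download_all"  # push all required course content up front
        return "stream"

    print(choose_delivery_mode(expects_offline_period=True, supports_streaming=True))
    # -> download_all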
  • the types of content and tools that may be used with learning management platform 100 are not limited to those discussed herein. Instead, the examples provided are meant to serve as possible types of content and tools that may be used, and are non-limiting examples.
  • Learning tools represent software required for delivery of learning content.
  • Learning tools may include, for example, video players, virtual whiteboard systems, video chat software, and web browsers.
  • a web browser plug-in may also be a learning tool.
  • Each of these tools may be required in order for the student to view the recommended content.
  • a recommended piece of content may consist of a flash movie.
  • a flash movie, in order to be played, requires a flash player to be invoked by the learning space platform 120 running on the student's client computing system.
  • Another example of a learning tool may be a game system.
  • Learning tools manager 140 manages and organizes learning tools.
  • learning tools manager 140 includes a tool information database, a tool repository, tool selection logic, and tool delivery logic.
  • the tool information database includes information about each learning tool, such as whether or not the tool will work with a particular type of client, such as a handheld device. For example, a flash player may not work on some mobile devices.
  • Other information in the tool information database may include information, such as network location information, that enables learning space platform 120 to invoke the download of a tool that is not stored in the tool repository. For example, a URL of a required tool, which may not be stored in the repository, may be provided to a student, along with a prompt to download the tool.
  • tool selection logic may assist learning space platform 120 in selecting a tool that is appropriate for a particular client device. For example, tool selection logic may determine that a particular media player, such as a video player or browser plugin, is required in order to view content that has been suggested to the student. Tool selection logic may determine that the student is using an Apple Macintosh computer, and provide the version of the tool that runs on Apple machines for download. In addition, tool selection logic may determine that no tool that plays the suggested media is available for the platform. Tool selection logic may then report this to learning experience engine 110 , which will make a new content recommendation in an embodiment.
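  • A hypothetical sketch of tool selection logic of this kind, in which the tool records, format names, and platform labels are invented for illustration:

    # Toy stand-in for the tool information database.
    TOOL_INFO_DB = [
        {"name": "flash_player", "plays": {"flash"}, "platforms": {"windows", "mac"}},
        {"name": "html5_video", "plays": {"mp4"}, "platforms": {"windows", "mac", "ios"}},
    ]

    def select_tool(content_format, client_platform):
        """Return a tool that plays the format on the client, or None if no tool is available."""
        for tool in TOOL_INFO_DB:
            if content_format in tool["plays"] and client_platform in tool["platforms"]:
                return tool["name"]
        return None  # caller can report back so a new content recommendation is made

    print(select_tool("flash", "ios"))  # -> None: recommend non-flash content instead
    print(select_tool("mp4", "mac"))    # -> html5_video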
  • Tool delivery logic is configured to manage requests for tools that are stored in the tool repository. For example, a student may require a tool that takes a significant time to download. Tool delivery logic may break up the tool into smaller parts for separate download in order to ensure successful delivery of the tool in the case of a lost connection. In addition, tool delivery logic may interact with a download manager in the learning space platform 120 .
  • Data analysis engine 150 performs a detailed analysis of all information gathered by other elements of the learning management platform 100 in order to identify correlations between student attributes and learning experiences. For example, changes in user profile information, assessment results, user behavior patterns, clickstream data, learning evolution information, resource monitoring information, or any other type of information available may be analyzed by data analysis engine 150. Data need not be structured in a particular way to be analyzed, and multiple sources of data may be analyzed in real time. In addition, multiple data sources may be aggregated, even if each source provides data in a different format or structure. The aggregated data may then be filtered to provide a detailed cross-platform analysis of specific data relationships.
  • Data analysis engine 150 may analyze profile information to determine groups of users that are similar to one another. In addition, data analysis engine 150 may determine the times of day, locations, and other transient attributes that are associated with a high degree of success for a student. For example, based on results of assessments taken at different times of the day, data analysis engine 150 may determine that a particular student studies more effectively between 9 am and 11 am, and that the positive effect is magnified when the student studies at a particular bookstore in town. Any attribute may be studied by data analysis engine 150 to determine correlations between student attributes and learning effectiveness.
  • data analysis engine 150 operates in a clustered computing environment, using existing software such as Apache Hadoop. In other embodiments, custom implementations of Hadoop or other software may be used, or a completely custom data analysis system may be used. Data analysis engine 150 includes reporting logic configured to provide detailed reports to learning experience engine 110. These reports assist learning experience engine 110 in making learning recommendations.
  • Knowledge Base 160 manages persistent data and persistently stores snapshots of certain transient data. For example, student categorization information, student study group information, cognitive DNA relationship information, and persistent student profile information may all be stored in knowledge base 160 . Although this data is persistently stored, the data may change as required by other elements of the learning management platform 100 .
  • data analysis engine 150 may provide a report to learning experience engine 110 that causes learning experience engine 110 to indicate to knowledge base 160 , based on the report, that student categorization information for a particular student should be changed. Knowledge base 160 will then alter the persistent data to reflect the indicated change.
  • knowledge base 160 includes a relational database management system to facilitate the storage and retrieval of data. Knowledge base 160 is communicatively coupled to learning experience engine 110 , and provides learning experience engine 110 with student information to assist in creating an individualized learning recommendation.
  • Personal cognitive DNA manager 170 manages data associated with students.
  • a collection of data associated with a student is known as personal cognitive DNA (PDNA).
  • the portions of a PDNA that are stored at the personal cognitive DNA manager 170 may be transient data, while persistent portions of the PDNA may be stored in knowledge base 160 .
  • PDNA data stored in personal cognitive DNA manager 170 may also include references to persistent data stored in knowledge base 160 .
  • Personal cognitive DNA manager 170 may include a database management system, and may manage PDNA for all students.
  • instances of personal cognitive DNA manager 170 may reside on the client computing devices of students, and may be part of the learning space platform 120 .
  • PDNA for users of the client computing device or the associated learning space platform 120 may be stored in volatile or non-volatile memory.
  • a combination of these embodiments may also be used, where a portion of the personal cognitive DNA manager 170 resides on a client while another portion resides on one or more servers.
  • Personal cognitive DNA manager 170 is communicatively coupled to learning experience engine 110 , and provides learning experience engine 110 with transitory student information to assist in creating an individualized learning recommendation. For example, a user's location, local time, client device type, or client operating system may be provided to learning experience engine 110 to assist in determining what type of content is appropriate for the environment and device.
  • personal cognitive DNA manager 170 and knowledge base 160 may be combined.
  • FIG. 3 is a block diagram illustrating a learning object in an embodiment.
  • a learning object such as learning object 300 is associated with data that describes that learning object.
  • the associated data may be stored in a learning object data structure.
  • a learning object is referenced by a learning object identifier, and associated data or references to the associated data may be stored in a relational database, and may reference the identifier to indicate that the data is associated with the learning object represented by the identifier.
  • Each learning object includes a skill, such as skill 310 .
  • a skill represents an ability that a student is meant to acquire.
  • a skill may represent the ability to perform addition of single-digit numbers, form a complete sentence using a particular language, or type a certain number of words-per-minute.
  • Each learning object includes content, such as content 320 .
  • the content of a learning object may include, by way of example and not by way of limitation, assessments, remediation data, skills hierarchy data, bloom level data, learning object metadata, and object-specific personalized data.
  • Content is said to be “included” as part of a learning object even though the content may only be referenced by the learning object and may not actually be stored within a learning object data structure.
  • Content may be stored in a content repository and managed by learning content manager 130 .
  • content is “tagged” with metadata describing the content, such as keywords, skills, associated learning objects, the types of learners (e.g. visual) that may benefit from the content, the type of content (e.g. video or text), and statistical information regarding the content usage.
  • Learning space platform 120 and learning experience engine 110 may be authorized to add, remove, or alter tags associated with content via the learning content manager 130 .
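  • The tagging and lookup behavior could look roughly like the following sketch, where the in-memory store and the tag fields stand in for the content categorization system and are assumptions only:

    content_tags = {}  # content_id -> metadata dict

    def tag_content(content_id, **metadata):
        """Attach or update descriptive metadata for a piece of content."""
        content_tags.setdefault(content_id, {}).update(metadata)

    def find_content(**criteria):
        """Return content ids whose tags match every supplied criterion."""
        return [cid for cid, tags in content_tags.items()
                if all(tags.get(k) == v for k, v in criteria.items())]

    tag_content("video-42", skill="using prepositions properly",
                content_type="video", learner_type="visual")
    tag_content("text-17", skill="using prepositions properly",
                content_type="text", learner_type="verbal")
    print(find_content(skill="using prepositions properly", content_type="video"))
    # -> ['video-42']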
  • FIG. 2 is a block diagram illustrating a skills hierarchy structure that may be used in an embodiment.
  • Skills hierarchy 200 A includes learning objects 201 - 208 , 211 , and 214 .
  • Skills hierarchy 200 B includes learning objects 207 - 213 and 215 - 217 .
  • a skills hierarchy may represent a group of skills, a portion of a course, a course, a field of study, a certificate program, a degree program, an individual competency map that represents the skills acquired by a student, or any other education related structure.
  • Skills hierarchies may be mapped to a wide variety of various learning theories, content types, and modes.
  • Links between objects in the hierarchy represent the relationship between those objects. For example, a link between two learning objects may mean that the subject matter covered in one of the learning objects builds on the subject matter covered in the other learning object. A different link may indicate that one learning object is a prerequisite of the other.
  • In skills hierarchy 200A, for example, the skill associated with learning object 205 must be acquired before advancing to the skill associated with learning object 208.
  • the learning skills hierarchy is hierarchical.
  • a non-hierarchical approach may be used in an embodiment.
  • a non-hierarchical directed-graph approach may be used in an embodiment that is based on a different learning model.
  • Skills hierarchy manager 180 manages skills hierarchy information that describes the relationships between skills. For example, a student may be required to learn how to add and subtract before he learns how to multiply and divide.
  • a complete skills hierarchy may be made up of smaller interconnected skills hierarchies that represent smaller groups of skills, which together may represent all of the skills required to complete a traditional educational course or degree program.
  • the nodes in the hierarchy correspond to learning objects. Since a single node may be considered a prerequisite for many other nodes, and many nodes may be prerequisites for a single node, the hierarchy may be multidimensional.
  • Skills hierarchy manager 180 manages the relationships between learning objects. For example, a relational database may be used to keep track of the node dependency information. Skills hierarchy manager 180 also stores object-specific data that describes skills, remediation information, assessment information, skills hierarchy association information, learning theory information, content information, tools information, and other metadata associated with learning objects. Skills hierarchy manager 180 interfaces with learning experience engine 110 and provides learning experience engine 110 with skills hierarchy data to assist learning experience engine 110 in creating an individualized learning recommendation.
  • the skills hierarchy information provided to learning experience engine 110 may include whole or partial skills hierarchies, and object-specific data may be filtered according to parameters passed to the skills hierarchy manager 180 in a request from learning experience engine 110 .
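  • A hedged sketch of how prerequisite relationships in a skills hierarchy might be tracked and queried; the edge direction and the learning object identifiers (chosen to echo FIG. 2) are assumptions, not the patent's storage schema:

    # prerequisite edges: dependent learning object -> set of required learning objects
    prerequisites = {
        "LO-208": {"LO-205"},             # LO-205 must be acquired before LO-208
        "LO-212": {"LO-208", "LO-209"},
    }

    def ready_to_attempt(learning_object, acquired_skills):
        """A learning object is available once all of its prerequisites have been acquired."""
        return prerequisites.get(learning_object, set()) <= set(acquired_skills)

    print(ready_to_attempt("LO-208", {"LO-205"}))  # -> True
    print(ready_to_attempt("LO-212", {"LO-205"}))  # -> False: LO-208 and LO-209 missing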
  • Learning models are created based on learning theories, and are meant to operationalize one or more learning theories. Different instructional strategies may be used, depending on the model to be implemented. Techniques include lecturing, case study, collaboration, one-on-one, direct instruction, and indirect instruction. Content and tools that facilitate the delivery of content can be used to implement instructional strategies. Examples of content include text, video, audio, and games. Examples of tools include video players, browser plug-ins, e-book readers, shared whiteboard systems, and chat systems. Many other examples of content and tools exist.
  • Learning modules can be optionally integrated into learning management platform 100 , in an embodiment.
  • Learning modules generally represent third-party applications or stand-alone applications with decision-making capabilities.
  • learning experience engine 110 may request input from a learning module such as learning module 190 A, and then take the input into consideration in making the overall learning recommendation decision.
  • learning module 190 A may recommend that a first content item be delivered to a student.
  • Learning experience engine 110 may then determine that the first content item should not be provided to the student for reasons that learning module 190 A is unable to consider, so learning experience engine 110 may recommend a similar content item, such as an item in a text format rather than an audio format.
  • Learning system modules 190 A, 190 B, and 190 C provide additional functionality to learning management platform 100 , and may be based on models other than those described herein.
  • Application Programming Interface (API) 115 provides a programming interface for learning system modules such as learning system modules 190 A, 190 B, and 190 C.
  • Learning system modules 190 A, 190 B, and 190 C are communicatively coupled to learning experience engine 110 via API 115 , and also may be communicatively coupled to other elements of learning management platform 100 , such as learning content manager 130 .
  • Learning system modules 190 A, 190 B, and 190 C may include third party or custom systems, hardware, or software modules.
  • learning models and theories may be operationalized using each student's PDNA to create an individualized learning experience for students.
  • Although PDNA is referred to herein as “personal cognitive DNA,” this label does not indicate that PDNA data collection and usage is limited to embodiments that are based upon cognitive models.
  • PDNA data includes information about a student's cognitive strengths and weaknesses (as well as preferences) that are provided explicitly by the student or inferred by the system as the student interacts with the system and the outcomes are measured.
  • PDNA may be used in any embodiment, independent of any particular learning model.
  • PDNA data is a collection of data associated with a student.
  • Transient profile data may be stored in the personal cognitive DNA manager 170, while persistent profile data may be stored in knowledge base 160.
  • PDNA data stored in personal cognitive DNA manager 170 may include references to persistent data directly or indirectly associated with the student that is stored in knowledge base 160 .
  • a rich data layer generally refers to information that is gathered and linked to create intelligence that may be used to inform learning experience engine 110 , which uses this information to generate learning recommendations.
  • the rich data layer is dynamically updating in that the data being collected changes over time, and data that does not conform to the changes becomes incorrect. For example, as a student achieves a high degree of proficiency with a particular skill, data that suggests that the student needs to become proficient with the skill becomes outdated and incorrect. Thus, the dynamic data layer must keep up with the current information available for each student.
  • Each student using learning management platform 100 is associated with PDNA for that student.
  • the PDNA for that student may contain minimal information, such as demographic information, a student's declared major, self-proclaimed learning style preferences, and imported transcript data such as grades and coursework done at other institutions.
  • Vast amounts of data may be collected and analyzed by data analysis engine 150, resulting in new PDNA information that describes how the student learns, what level the student has achieved in a particular course, whether the student understands a particular concept or possesses a particular skill, the pace at which the student learns, or even the time of day the student is most likely to correctly answer a question.
  • the PDNA may include data that identifies the student's current location, what client computing device they are using (e.g. iPhone, laptop, or netbook), what operating system they are using, whether or not their web browser supports the Flash plug-in, or whether the student sets his status as “tired.”
  • a particular student may perform differently depending on environmental factors, while another student may be capable of learning regardless of the environment. For example, one student may be able to study on a commuter train while another may not. One student may be capable of learning via an audio program while another requires text information or video.
  • certain tasks may be reserved for particular times or places in order to calibrate the system.
  • the system may be configured to only offer assessments when a student's transient PDNA data shows that the time in the student's current time zone is between certain daylight hours.
  • Metrics may be assigned to particular attributes in each student's PDNA. For example, metrics may describe expected or historical success with different learning styles. These metrics may help learning experience engine 110 determine whether the student is successful when participating in collaborative learning exercises, or whether the student would benefit more from self-study. A student may have a metric of “7” for the attribute “visual learner” and a metric of “2” for the “audio learner” attribute. Higher scores are not necessarily the only factor used in determining the learning strategy for the student, however. For example, the learning track that the student is on may actually require that the student develop listening skills. Therefore, the learning experience engine 110 may recommend a learning experience for the student that will bolster his ability to learn via audio. In other words, the system will teach the student the underlying skills required to allow the student to become a successful audio learner.
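  • A small sketch of how such metrics might be weighed; the metric values and the track-requirement override are illustrative assumptions only:

    pdna_metrics = {"visual_learner": 7, "audio_learner": 2}

    def pick_modality(metrics, track_requires=None):
        """Prefer the strongest modality unless the learning track requires another."""
        if track_requires and track_requires in metrics:
            return track_requires  # e.g. the track demands that listening skills be developed
        return max(metrics, key=metrics.get)

    print(pick_modality(pdna_metrics))                                   # -> visual_learner
    print(pick_modality(pdna_metrics, track_requires="audio_learner"))   # -> audio_learner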
  • PDNA for a particular student may be analyzed and compared to PDNA of other students to make learning experience predictions.
  • data analysis engine 150 may analyze the PDNA information for all students in the system, over time, to predict various things, including: what method of learning is best for each student, which track will yield the highest chance of success for a given student in a particular program, which programs the student would be successful in, which courses the student may be expected to struggle with, and even which career would best suit the student.
  • As students use learning management platform 100 to manage their learning needs, natural checkpoints emerge from the data. The frequency of these checkpoints can be adjusted by system administrators, but is generally related to a level of achievement reached by the student for which the checkpoint was generated. For example, a checkpoint may be generated for a student each time that student successfully demonstrates a skill associated with a learning object.
  • snapshots of personal cognitive DNA information are taken at each checkpoint.
  • a snapshot is a static record of PDNA as it existed at the time of the checkpoint.
  • Each snapshot is stored in a temporal database or other data storage mechanism, such as knowledge base 160 .
  • the PDNA information stored in the snapshot may be used by learning experience engine 110 to formulate recommendations for other students that may have personal cognitive DNA that is similar to the snapshot.
  • a DNA fingerprint is based on aggregate PDNA data, which may include PDNA snapshots.
  • a DNA fingerprint is made by selecting a set of PDNA data having one or more PDNA attributes in common and generating a single profile that is representative of the entire set.
  • a DNA fingerprint may be generated for students that have recently completed a learning object that teaches the calculus skill of taking the derivative of a second degree polynomial.
  • the recency of the completion of the learning object is determined based on the time that the data was stored, so snapshots that were taken at checkpoints occurring immediately after students completed the learning object will qualify for inclusion in the set of PDNA data considered for use in the creation of the DNA fingerprint.
  • Each attribute in the PDNA data considered in the creation of the DNA fingerprint may be aggregated, averaged, or otherwise considered, resulting in a fingerprint of that attribute. For example, if the average value of the attribute “abstract learning ability” is “80” in the PDNA data in the set, then the DNA fingerprint may inherit this value for the same attribute.
  • any method of considering or combining PDNA data to generate DNA fingerprint data may be used. For example, the lowest value, the median value, or a sum of the values may be used as the fingerprint value for a particular attribute. Some attributes, especially those with very little correlation to the common PDNA attribute, may not be assigned a DNA fingerprint value, or may be assigned a NULL value, indicating that conclusions about that attribute are statistically invalid for that set of PDNA. Once each attribute has been considered for the set, then the resulting values for each attribute are stored in one or more records as a DNA fingerprint for that set of PDNA data.
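  • The aggregation step might be sketched as follows, assuming an averaging rule and a minimum-sample cutoff for NULL attributes; the attribute names are invented, and other aggregations (median, minimum, sum) would slot in the same way:

    def make_fingerprint(snapshots, min_samples=2):
        """Aggregate a set of PDNA snapshots into one representative profile."""
        attributes = {k for snap in snapshots for k in snap}
        fingerprint = {}
        for attr in attributes:
            values = [snap[attr] for snap in snapshots if attr in snap]
            # Attributes with too few samples get no fingerprint value (NULL).
            fingerprint[attr] = (sum(values) / len(values)
                                 if len(values) >= min_samples else None)
        return fingerprint

    snapshots = [
        {"abstract_learning_ability": 85, "visual_learner": 6},
        {"abstract_learning_ability": 75, "visual_learner": 8, "audio_learner": 3},
    ]
    fp = make_fingerprint(snapshots)
    print(fp["abstract_learning_ability"], fp["audio_learner"])  # -> 80.0 None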
  • a database of snapshots and DNA fingerprints may be used to shorten learning experience engine 110 recommendation decision making time by matching a student's PDNA with a pre-existing DNA fingerprint.
  • Because fingerprints are based on snapshots, some PDNA data used to generate the matching fingerprint may have already advanced far beyond the stage when the snapshot was taken. For example, a snapshot that was taken two years ago may be used in the creation of a DNA fingerprint.
  • the attributes of the PDNA that change over time, such as those that are based on location or “last learning object completed,” may be reflected in the fingerprint but not in the current PDNA of the student or students whose data was used to create the fingerprint. Because of this, a student whose current PDNA has a high correlation with a fingerprint that is based on an old PDNA snapshot of another student may be considered similar to an “old” version of that other student.
  • If fingerprint PDNA data indicates that students having a particular attribute progressed at a particular rate or excelled in a particular subject, it may be assumed that students with a matching PDNA will experience the same success.
  • Over time, students associated with the PDNA data upon which a fingerprint is based may become less and less alike, and the divergence of some attributes may result in less valid prediction information if a fingerprint were to be based on the same set of students but using snapshots associated with subsequent checkpoints.
  • recommendations that are based on a comparison between a student's current PDNA and fingerprints based on sets of PDNA data having the same values for attributes as the student's current PDNA will be highly accurate.
  • the student's PDNA may be matched to a new fingerprint whenever necessary, desired, or requested.
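  • A hypothetical matching sketch, in which the distance measure and the recommendation attached to each fingerprint are assumptions used only to show the lookup:

    def match_fingerprint(student_pdna, fingerprints):
        """Return the (fingerprint_id, record) pair closest to the student's current PDNA."""
        def distance(fp):
            shared = [k for k in student_pdna if fp["attributes"].get(k) is not None]
            return sum(abs(student_pdna[k] - fp["attributes"][k]) for k in shared)
        return min(fingerprints.items(), key=lambda item: distance(item[1]))

    fingerprints = {
        "fp-visual": {"attributes": {"visual_learner": 7, "audio_learner": 2},
                      "recommendation": "video lecture"},
        "fp-audio": {"attributes": {"visual_learner": 2, "audio_learner": 8},
                     "recommendation": "podcast series"},
    }
    fp_id, record = match_fingerprint({"visual_learner": 6, "audio_learner": 1}, fingerprints)
    print(fp_id, record["recommendation"])  # -> fp-visual video lecture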
  • Learning recommendations generated by learning experience engine 110 are not limited to suggesting which course a student should take next.
  • a student's PDNA may be used to make very specific recommendations based on a determination that the student is cognitively similar to one or more other students.
  • the fingerprint method described above may be used, matching the student based on student grouping criteria that is stored in knowledge base 160 . In addition, any other matching technique may be used.
  • An individualized learning experience for each student may be achieved by providing real-time recommendations based on predictive information associated with cognitively similar students.
  • if two students cease to be cognitively similar, the second student's profile is no longer considered when selecting which learning recommendations to give to the first student.
  • Although a short portion of the learning experience, such as learning a particular concept or skill required for a portion of a course, may be similar to the experience of others, students may complete an overall learning program of study by taking a completely unique path with respect to other students.
  • a temporal fingerprint path generally refers to a series of DNA fingerprints that are generated based on the same set of students. For example, consider a situation in which a PDNA snapshot for Lenny Learner that was taken one year ago was used in the fingerprint creation process of a particular fingerprint. All PDNA sets used in the particular fingerprint, including Lenny's, may then be analyzed for subsequent snapshot information to create a temporal fingerprint path.
  • each fingerprint that is part of a temporal fingerprint path may be based on snapshots that are not associated with the same “time.” Instead, each fingerprint may be based on a series of snapshots for each user in the set of users used to generate the fingerprint without respect to the time the snapshots were taken.
  • Lenny Learner and Laura Learner may each be associated with PDNA information used in generating a temporal fingerprint path that includes five fingerprints.
  • the snapshots associated with Lenny that are used to generate the five fingerprints may have been generated over a period of five years, starting seven years ago.
  • the snapshots associated with Laura that are used to generate the five fingerprints may have been generated over a period of two years, starting three years ago.
  • the temporal fingerprint path may be designed to represent progression through a series of learning objects, with Lenny and Laura being chosen based on their involvement with the same learning objects.
  • a temporal fingerprint path may be more strictly based on time.
  • the snapshots associated with Lenny that are used to generate the five fingerprints may have been generated over a period of exactly two years, starting seven years ago, and the snapshots associated with Laura may have been generated over a period of exactly two years, starting three years ago.
  • the length of time from the beginning of the fingerprint path is given added importance.
  • snapshots for a particular student occurring earlier than snapshots for that student that are used in generating a first fingerprint are unlikely to be used in the generation of a second fingerprint in a temporal fingerprint path.
  • a temporal fingerprint path generally implies progression, so a subsequent snapshot for that user would likely be selected to use in generation of subsequent fingerprints.
  • Temporal fingerprints are particularly useful in generating long-term learning strategies for students that are enrolled in a degree or certification program that has specific requirements. This is particularly true for students for which very little PDNA information has been gathered. For example, a student may have no history with the system, but may be enrolled in a Computer Science degree program. A temporal fingerprint path may be generated based on the set of students that have completed the Computer Science degree program. Based on the temporal fingerprint path, learning experience engine may provide an expected long-term learning strategy to the student. As more students progress through the Computer Science degree program, the set of students considered for generating a temporal fingerprint path for the program may change, altering the temporal fingerprint path. For example, the set of students considered for generating a temporal fingerprint path may be based only on students that have completed the Computer Science degree program within the last five years in an embodiment.
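  • A sketch of building such a path, assuming snapshots are aligned by checkpoint index (rather than calendar time) and using a simple per-attribute average as the fingerprint rule; the cohort names and values are invented:

    def temporal_fingerprint_path(cohort_snapshots, make_fingerprint):
        """cohort_snapshots: {student: [snapshot_1, snapshot_2, ...]} in checkpoint order."""
        path_length = min(len(snaps) for snaps in cohort_snapshots.values())
        # One fingerprint per checkpoint index, built from each student's i-th snapshot.
        return [make_fingerprint([snaps[i] for snaps in cohort_snapshots.values()])
                for i in range(path_length)]

    cohort = {
        "Lenny": [{"calculus": 2}, {"calculus": 5}, {"calculus": 8}],
        "Laura": [{"calculus": 3}, {"calculus": 6}, {"calculus": 9}],
    }
    average = lambda snaps: {k: sum(s[k] for s in snaps) / len(snaps) for k in snaps[0]}
    print(temporal_fingerprint_path(cohort, average))
    # -> [{'calculus': 2.5}, {'calculus': 5.5}, {'calculus': 8.5}]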
  • a content feedback interface may be provided with learning space platform 120 .
  • FIG. 4 illustrates a browser window 400 with a video content window 410 and a content feedback interface 420 in an embodiment. By selecting a number of “stars,” a student may indicate her overall satisfaction with the content displayed in video content window 410 .
  • the content feedback interface allows students to provide feedback about the content, and whether or not the content or the tool that enables the content was appropriate for the situation in which it was used.
  • a content feedback interface may allow users to rate content based on many factors, such as whether they found the content convenient, easy to understand, appropriate as a next step to the previous content used, or whether the student likes the content. Text entry fields and other interface elements may be used as appropriate for gathering additional feedback data from students.
  • All of the content feedback may be stored in learning content manager 130 , and used to tag content. This allows users to force content adaptation and alter the learning path of other students by increasing or decreasing the likelihood that the content will be used in a particular situation or for a student with a particular type of PDNA.
  • two different video presentations may be used to teach the same skill.
  • the first video presentation may have a higher success rate than the second video presentation, but the second video presentation may be preferred by students.
  • learning experience engine 110 may recommend the second video presentation when the disparity between the preferences associated with the two presentations reaches a certain threshold. That is to say, when the less effective second video presentation is much more preferred by students than the first video presentation, the second video presentation will become the default presentation recommended to students.
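  • A toy version of that threshold rule, in which the metric names, the values, and the margin are assumptions chosen for illustration:

    def default_presentation(presentations, preference_margin=0.3):
        """presentations: list of dicts with 'id', 'success_rate', and 'preference'."""
        by_success = max(presentations, key=lambda p: p["success_rate"])
        by_preference = max(presentations, key=lambda p: p["preference"])
        gap = by_preference["preference"] - by_success["preference"]
        # Swap to the better-liked presentation only when the preference gap is large enough.
        if by_preference is not by_success and gap >= preference_margin:
            return by_preference["id"]
        return by_success["id"]

    videos = [
        {"id": "video-1", "success_rate": 0.82, "preference": 0.40},
        {"id": "video-2", "success_rate": 0.75, "preference": 0.90},
    ]
    print(default_presentation(videos))  # -> video-2: much preferred despite lower success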
  • assessments such as assessments 340 are included in each learning object.
  • a student uses assessments to demonstrate the skill associated with the learning object.
  • More than one assessment may be included in a learning object.
  • Learners may be required to successfully complete all, or a subset, of the assessments in order to receive an advancement recommendation from learning experience engine 110 .
  • the number and type of assessments required may depend on historical data describing past experiences with the learning object or other related learning objects. For example, if a student is known to have difficulty with learning objects that include mathematical skills, a student may be required to successfully complete relatively more assessments for a particular learning object associated with a math skill.
  • Remediation data, such as remediation data 330, includes detailed information describing which learning objects are preferred remediators (objects that assist in skill building) for the current learning object, or even which learning objects the current learning object is a preferred remediator for.
  • a student may experience little or no success in completing the assessments associated with learning object 212 .
  • Learning object 212 may include remediation information listing learning objects 205 and 209 as good remediators.
  • Remediation metrics may be used to help learning experience engine 110 determine which remediator to recommend. If learning object 205 has a higher remediation metric than learning object 209, then learning object 205 may be selected, and the student will be provided content and assessments associated with learning object 205. When the student is ready, he will advance to learning object 208, and then make another attempt at learning object 212.
  • Remediation information and remediation metrics may be different for different types of users, or users having particular types of PDNA.
  • learning object 205 may have a higher remediation metric than learning object 209 for group A, but learning object 205 may have a lower remediation metric than learning object 209 for group B.
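  • A sketch of group-specific remediation metrics for learning object 212; the metric values and the group labels are invented for illustration:

    remediation_metrics = {
        # remediators of LO-212, scored per PDNA group
        "LO-212": {"LO-205": {"group_A": 0.8, "group_B": 0.4},
                   "LO-209": {"group_A": 0.5, "group_B": 0.7}},
    }

    def pick_remediator(struggling_object, pdna_group):
        """Recommend the remediator with the highest metric for the student's group."""
        candidates = remediation_metrics[struggling_object]
        return max(candidates, key=lambda lo: candidates[lo][pdna_group])

    print(pick_remediator("LO-212", "group_A"))  # -> LO-205
    print(pick_remediator("LO-212", "group_B"))  # -> LO-209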
  • remediation information is not limited to learning object relationships. Remedial relationships may be formed at the assessment level, even if the assessments are not within the same learning object or the same skills hierarchy.
  • Skills hierarchy data such as skills hierarchy data 360 describes a learning object's skills hierarchy associations. Each learning object may be included in multiple skills hierarchies. For example, learning object 207 is included in skills hierarchy 200 A and skills hierarchy 200 B. Each of skills hierarchy 200 A and skills hierarchy 200 B represent a skill set. For example, skills hierarchy 200 A and skills hierarchy 200 B may each represent a course, such as English Composition or Linear Algebra. Overlap in skills hierarchy data illustrates the multidimensional nature of the larger skills hierarchy used in learning management platform 100 .
  • Bloom level data such as Bloom level data 370 may be included in a learning object.
  • Bloom level data identifies Bloom taxonomy information for one or more courses in which the learning object is required.
  • the six Bloom levels describe whether a user has knowledge and can remember a concept, understands and can describe and explain a concept, can apply the concept, can analyze based on or according to the concept, can evaluate based on or according to the concept, or can create based on or according to the concept.
  • a particular learning object may be associated with one Bloom level for a particular skills hierarchy, but may be associated with a different Bloom level for another skills hierarchy. For example, learning object 211 may be at the "evaluate" Bloom level for skills hierarchy 200 A, but may be at the "apply" Bloom level for skills hierarchy 200 B.
  • Metadata such as metadata 350 may include data about the learning object. For example, version information, change tracking information, or other information about the learning object may be stored in metadata 350. Metadata 350 may also include data that should be communicated to learning space platform 120 when learning space platform 120 instantiates the learning object and associated content and tools. For example, a description or mini-syllabus describing the learning object and associated skill may be sent by learning experience engine 110 when the learning object is selected by learning space platform 120. Additional metadata may be stored in learning object 300. Notes, future implementation specifics, and developmental skills hierarchy identifiers are all examples of data that may be stored in metadata 350. The amount of metadata that may be stored about an object is bound only by the system constraints or administrator discretion. Thus, metadata 350 may be considered a catch-all for other data that is to be associated with a learning object. In addition, learning objects are extendable, and may have other data categories associated with them that are not described herein.
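  • Purely as an illustration of the data categories described above, a learning object might be modeled as follows; the field names are assumptions and not the actual schema used by learning management platform 100.

      # Hypothetical sketch of the data categories a learning object such as
      # learning object 300 might carry; field names are illustrative only.
      from dataclasses import dataclass, field

      @dataclass
      class LearningObject:
          skill: str                                            # the skill the object teaches
          content: list = field(default_factory=list)           # content items (videos, text, games)
          assessments: list = field(default_factory=list)       # assessments such as assessments 340
          remediation: dict = field(default_factory=dict)       # remediation data such as 330
          metadata: dict = field(default_factory=dict)          # metadata such as 350 (version, notes)
          skills_hierarchies: list = field(default_factory=list)  # skills hierarchy data such as 360
          bloom_levels: dict = field(default_factory=dict)       # Bloom level data such as 370, per hierarchy

      lo211 = LearningObject(
          skill="evaluate a statistical argument",
          bloom_levels={"200A": "evaluate", "200B": "apply"},   # per-hierarchy Bloom levels
      )
      print(lo211.bloom_levels["200A"])
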
  • Detailed personalized information may be stored on a per-person, per-object basis. For example, a particular student may have reached a certain level of understanding with respect to a particular skill that is associated with a particular object.
  • the information stored may be mapped to Bloom taxonomy levels, and may include additional student metadata that describes the particular experience that the student has had with the object. For example, metadata may describe how fast the student learned the skill, whether the student enjoyed the content used to learn the skill, and trouble-areas for the student.
  • This information may be gathered through frequent assessments, learning activities, learning games, homework assignments, and participation in group activities, in addition to other information-generating events associated with the student's interaction with learning management system 100 .
  • a skills hierarchy for a particular course syllabus such as skills hierarchy 200 B, may include multiple learning objects, such as learning objects 207 - 213 and 215 - 217 .
  • the skills hierarchy may be for a statistics course, and require all of the skills included in the associated learning objects to be acquired by the student in order to mark the course as “completed” for the student.
  • Skills hierarchy 200 B is not necessarily representative of a desired or anticipated size of a skills hierarchy for a particular course.
  • a skills hierarchy for a statistics course may include hundreds of learning objects, each directed to a granular skill or concept.
  • a student begins traversing the skills hierarchy by receiving a learning recommendation from learning experience engine 110 . If the student has not interacted with learning management platform 100 before, then the student will need to provide information to help build a PDNA. For example, the student may need to take one or more pre-assessments to determine the skill level that the student has with respect to his program. Generally, the more pre-assessments the student takes, the more accurate the initial recommendation will be. Other useful information such as transcript information from higher education institutions may also help to build a PDNA for the user.
  • Learning experience engine 110 takes into account information stored in the student's PDNA, such as pre-assessment information, information stored in knowledge base 160, and information learned from data analysis engine 150, to determine which skills hierarchy the student should traverse, and which learning object on that skills hierarchy the student should begin with.
  • knowledge base 160 may include information about the student that indicates the student has completed the course associated with skills hierarchy 200 A, and has therefore completed learning objects 207 , 208 , and 211 . The learning recommendation may therefore not consider these learning objects as required because they have been completed in a previous course.
  • Learning experience engine 110 may therefore recommend that the student begin at learning object 209 on skills hierarchy 200 B.
  • Learning object 212 may represent the ability to calculate a confidence interval, which is used to indicate the reliability of statistical estimates. Because confidence intervals are typically expressed as a percentage, the ability to calculate percentages may be considered a prerequisite skill, identified by learning object 208 .
  • Skills hierarchy 200 B indicates that learning object 209 also represents a prerequisite skill for learning object 212 . Given this information, and the knowledge that the student has completed the course represented by skills hierarchy 200 A, it may be determined that learning object 209 is the only prerequisite required in order for the student to attempt learning object 212 , even though learning object 208 was completed as part of a separate course.
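  • The prerequisite check described above could, under these assumptions, be sketched as follows; the prerequisite map and object identifiers are illustrative.

      # Hypothetical sketch: determine whether a student may attempt a learning
      # object by checking its prerequisites against objects completed in any course.
      prerequisites = {"LO212": {"LO208", "LO209"}}   # illustrative hierarchy edges

      def may_attempt(target, completed):
          """completed is the set of learning objects finished across all courses."""
          return prerequisites.get(target, set()) <= completed

      completed_objects = {"LO207", "LO208", "LO211"}             # e.g. from skills hierarchy 200A
      print(may_attempt("LO212", completed_objects))              # False: LO209 still required
      print(may_attempt("LO212", completed_objects | {"LO209"}))  # True
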
  • learning experience engine 110 may recommend that the student move to learning object 203 .
  • Learning experience engine 110 may therefore recommend that the student begin at a much higher level, even skipping levels in the skills hierarchy.
  • a student that successfully completes the most difficult assessments associated with learning object 211 with perfect accuracy may receive a recommendation to attempt assessments associated with learning object 217 in order to complete the course.
  • learning experience engine 110 may indicate that the student has finished the course, or may recommend additional content that is appropriate for the student.
  • For example, an MBA student working in a collaborative session on a finance case study may need to calculate NPV (net present value); the present learning object for that student may be interrupted by learning space platform 120, which may then retrieve a recommendation from learning experience engine 110.
  • Learning experience engine 110 may recommend that the student use a particular behaviorism-based tool associated with another learning object in order to acquire the necessary skill to calculate NPV.
  • Once the student successfully completes the mathematics assessment, he will then be returned to the collaborative session with the other MBA students.
  • This example illustrates the platform's unifying data concepts and data-driven behavior in switching between appropriate learning models (e.g., a student may have been in a social constructivist learning space working on a case study on finance when the need for remediation of a concept surfaced, and the next experience chosen by the platform might be an implementation of a cognitive tutor designed for math instruction).
  • learning experience engine 110 effectively created a mini-course for the student to ensure he has the skills required to be successful on his current track. This is possible partly because the learning management platform 100 keeps track of data about the students across different courses, and uses this information to help the student traverse the tree. For example, the MBA student may have already taken algebra, so a short remedial path to remind the student may have been the only information that the student needed in order to move on. However, if the student's algebra course was over two years ago, and he has taken no math since then, learning experience engine 110 may recommend a more detailed review of algebraic concepts, even taking into consideration future courses that the student is expected to take and the math skills required for those expected courses.
  • Remediation data associated with learning object 216 indicates that learning object 209 or a particular assessment or content item associated with learning object 209 is a good remediator for learning object 216 , or even a particular assessment associated with learning object 216 . If the student is directed to learning object 209 for remediation, and then attempts the assessment at learning object 216 again with success, then a remediation metric describing the association between learning object 216 and learning object 209 is changed to reflect the success. In this way, the skills hierarchy itself is dynamic and always changing based on input from the students.
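  • One possible way to nudge a remediation metric after a remediation attempt is sketched below; the smoothing factor and prior value are assumptions.

      # Hypothetical sketch: nudge a remediation metric up or down after a student
      # is sent to a remediator and then re-attempts the original assessment.
      LEARNING_RATE = 0.1   # assumed smoothing factor

      def update_remediation_metric(metrics, target, remediator, succeeded):
          key = (target, remediator)
          outcome = 1.0 if succeeded else 0.0
          current = metrics.get(key, 0.5)   # assume a neutral prior
          metrics[key] = current + LEARNING_RATE * (outcome - current)

      metrics = {("LO216", "LO209"): 0.5}
      update_remediation_metric(metrics, "LO216", "LO209", succeeded=True)
      print(metrics[("LO216", "LO209")])   # -> 0.55, slightly strengthened association
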
  • a skills hierarchy includes many learning objects, and learning objects include or are associated with a skill and a variety of content and assessments. Although two students may be said to have acquired the same skills by completing the same course, each student may have taken a different learning path.
  • student A may be a visual learner, and be cognitively similar to a set of students. These students are placed into a particular group, or "tribe," based on their cognitive similarities. In this example the tribe is referred to as tribe A.
  • Student B, who is part of a different tribe, may learn well by reading text, and gain very little benefit from visual learning techniques. Based on this information, learning experience engine 110 will provide different recommendations to each student for some learning objects. For example, student A may acquire the skill associated with learning object 206 by watching a video and completing three assessments that are based on an interactive game tool. Student B, on the other hand, may acquire the same skill by reading a chapter in an e-book, and completing a single assessment requiring a writing assignment.
  • students may provide feedback by “tagging” content that they are interacting with. For example, a student may particularly enjoy an interactive learning game. The student may select a user interface element such as a button labeled with the word “fun” to indicate the preference. This information may be used in later learning recommendations for the student and other students that are cognitively similar to the student.
  • the skills hierarchy may also evolve over time.
  • learning experience engine 110 may determine that the object itself is defective or produces a sub-optimal result. For example, if a group of students that tend to perform well do not perform well on assessments associated with a particular object, then it is possible that the content used to teach the skill is mismatched with the assessments associated with the particular learning object.
  • Learning experience engine 110 may generate reports that highlight these deficiencies so that course developers and content developers may change the learning object, or introduce an alternate, competitive learning object.
  • Competitive learning objects will be described in greater detail hereinafter.
  • Students, faculty, and other users of the system may interact with one another using collaborative tools and social networking features associated with the learning management platform 100 .
  • any student working on any assignment from any location at any time will be able to click a single button to inquire who else among the faculty or students is working at the same time, in the same or similar content area, and then begin to engage in questions or discussion.
  • a student that is working on a particular learning object may wish to collaborate with other students, faculty, tutors, or other users of the system working on the same learning object.
  • the student may press a button or otherwise interact with a user interface element associated with collaboration logic built into learning experience engine 110 , thereby generating a collaboration request to the learning management system.
  • the learning management system determines which other users of the system are available for collaboration with respect to the particular learning object.
  • the learning management system then returns a list of available collaborators to the student.
  • this embodiment is directed to collaboration based on learning objects, this concept may be used with any other commonality between collaborating users. For example, users may collaborate with other users of the system that have some association with a particular skill hierarchy.
  • users of the system may define preferences associated with collaborative learning. For example, students may wish to work collaboratively with other students, or may wish to only work with faculty or tutors. Additionally, students may wish to collaborate only with other students that have similar cognitive DNA. These preferences and other preferences based on any other similarities or differences between users may be stored as part of a student's cognitive DNA, and may be used to determine the makeup of a collaborative group.
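  • A minimal sketch of preference-aware collaborator matching follows; the profile fields such as collab_prefs and tribe are hypothetical names introduced for illustration.

      # Hypothetical sketch: filter available collaborators by the requesting
      # student's stored collaboration preferences; field names are illustrative.
      def find_collaborators(requestor, candidates):
          prefs = requestor.get("collab_prefs", {})
          allowed_roles = prefs.get("roles", {"student", "faculty", "tutor"})
          matches = []
          for user in candidates:
              if user["learning_object"] != requestor["learning_object"]:
                  continue                         # same-object collaboration only
              if user["role"] not in allowed_roles:
                  continue
              if prefs.get("similar_pdna_only") and user["tribe"] != requestor["tribe"]:
                  continue
              matches.append(user["name"])
          return matches

      student = {"name": "A", "learning_object": "LO206", "tribe": "A",
                 "collab_prefs": {"roles": {"faculty", "tutor"}}}
      others = [
          {"name": "B", "role": "student", "learning_object": "LO206", "tribe": "A"},
          {"name": "C", "role": "tutor", "learning_object": "LO206", "tribe": "B"},
      ]
      print(find_collaborators(student, others))   # -> ['C']
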
  • Collaborative groups may also be limited to optimal group sizes.
  • the group sizes of the groups to which users are assigned may be based on cognitive DNA similarities between the users. For example, a particular type of student may perform better in a smaller group, while another type may perform well with larger groups.
  • Group size may also be based on the type of collaborative tool being used. For example, a shared whiteboard system may become less effective as more people attempt to draw on it.
  • a chat room associated with a video tutorial may allow for a larger number of users, which may be further based on the activity level of the chat room.
  • Students may collaborate with other users of the system that are using the same learning tools as themselves, or may collaborate with students that are using a different learning tool than they are. For example, a collaboration may involve one student writing on a white board, and another student typing in a chat room.
  • the input to the white board may be dynamically translated to text that appears in the chat room, while the text in the chat room appears on the white board.
  • a user of the system such as a faculty member or first student may provide a learning tool recommendation to a second student based on the information shared in the collaboration session.
  • Each node in a skills hierarchy has been described herein as having a single learning object. However, nodes in a skills hierarchy need not be limited to having only one associated learning object.
  • Competitive learning objects that are directed to a similar or identical skill may reside at the same node in a skills hierarchy. Each learning object may include different content and assessments than the other learning objects that occupy the node.
  • Metadata associated with each learning object of a node may indicate the success or failure of that learning object across categories. For example, one learning object may be more successful for nursing students while another learning object may be more successful for engineering students. Although different categories of students are discussed, other factors may be used to measure the success of a learning object. For example, a first learning object may be a better remediator than a second learning object for a particular node in a skills hierarchy that is associated with a different course.
  • Competing learning objects are also useful when introducing new content and skills into the learning management platform 100 . Adding a learning object as a competitive learning object rather than replacing the existing learning object allows for a trial period to determine whether the new learning object provides an improvement over the old. Integration of new objects and content may therefore be seamless. It may also turn out that the best action is to leave all of the competing learning objects in place because they each provide a different benefit that the other learning objects sharing the node space cannot.
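  • As a hypothetical sketch, selection among competing learning objects at a node could use per-category outcome metadata as follows; the categories and scores shown are illustrative.

      # Hypothetical sketch: at a node holding competing learning objects, pick the
      # object with the best recorded outcome for the student's category.
      node_candidates = {
          "LO_games": {"nursing": 0.65, "engineering": 0.80},
          "LO_videos": {"nursing": 0.78, "engineering": 0.70},
      }

      def select_competing_object(candidates, student_category):
          return max(candidates, key=lambda lo: candidates[lo].get(student_category, 0.0))

      print(select_competing_object(node_candidates, "nursing"))      # -> LO_videos
      print(select_competing_object(node_candidates, "engineering"))  # -> LO_games
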
  • the learning management platform 100 includes an Application Programming Interface (API) 115 that is configured to interact with learning system modules.
  • API 115 may be implemented over a network connection or any other communication method.
  • Learning system modules may include third-party artificial intelligence systems or other decision-making, recommendation, and learning systems. These learning system modules may have access to other elements of the learning management platform 100, such as personal cognitive DNA manager 170, or may perform independently of these other elements.
  • skills hierarchy manager 180 operates as a learning system module, and interfaces with API 115 .
  • Learning modules are not limited by learning management platform 100 , and may include additional supporting systems, hardware, networking equipment, cloud-computing systems, and external data sources. Learning modules may include any software, hardware, or network resources necessary to perform optimally.
  • a skills hierarchy-based system, such as skills hierarchy manager 180, is not required for learning management platform 100 to function.
  • Learning system modules such as learning system modules 190 A, 190 B, and 190 C, may be configured to operate based on any model or criteria.
  • Rule-based models may include a decision-making structure that is much different than a directed graph approach, taxonomy, or the skills hierarchy described herein.
  • a learning system module configured to operate using a rule-based model may receive, as input from learning experience engine 110 , a text-based answer from a student. Based on rules within the model used by the learning system module, the module may reject the answer, and may provide associated information to learning experience engine 110 .
  • the rule may be based on linguistics or any other criteria.
  • the learning system module may detect that the text-based answer was misspelled, or that a word did not have the required number of syllables, or that the sentence or paragraph structure was incorrect.
  • a learning system module may even include rules to determine that an essay is missing a thesis sentence.
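  • A toy rule-based module of this kind might be sketched as follows; the specific rules (a required term and a minimum sentence count) are illustrative assumptions, not the rules of any particular module.

      # Hypothetical sketch of a rule-based learning system module that rejects a
      # text answer and reports why; the rules shown are illustrative only.
      import re

      def check_text_answer(answer, expected_word, min_sentences=3):
          problems = []
          if expected_word.lower() not in answer.lower():
              problems.append("missing required term: " + expected_word)
          sentences = [s for s in re.split(r"[.!?]+", answer) if s.strip()]
          if len(sentences) < min_sentences:
              problems.append("too few sentences for a complete paragraph")
          accepted = not problems
          return {"accepted": accepted, "problems": problems}

      result = check_text_answer("Photosynthesis uses light.", "chlorophyll")
      print(result)   # rejected, with the reasons returned to the learning experience engine
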
  • Learning system modules need not be tied to one particular model.
  • a rule-based model may be combined with another model, such as a directed graph-based model in order to achieve the advantages of both models.
  • no formal model is required in order to interface with learning experience engine 110 .
  • a subject-specific cognitive tutor may be developed with no regard to learning theory whatsoever, and use a completely new structure and means for decision making, and that cognitive tutor may function as a learning system module that may be “plugged-in” to learning experience engine 110 by using an interface compatible with API 115 .
  • Competing learning system modules may be used, allowing learning experience engine 110 to receive input from multiple modules, and then provide the best overall learning experience for the student.
  • an artificial intelligence based module, a taxonomy module, and a rule-based module that are all designed to teach a student to play the game of chess may be concurrently communicatively connected to API 115 .
  • Each module may store metadata associated with learning content stored in learning content manager 130 , learning tools stored or indexed in learning tools manager 140 or any other data required to provide a response to learning experience engine 110 .
  • Learning experience engine 110 may request information, data, or recommendations from each module that it then uses to provide a learning recommendation to the student.
  • Learning experience engine 110 may determine that the experience provided to the student is superior when a particular module is used during a particular portion of the learning experience. This decision may be based on any criteria. For example, the experience may be deemed superior based on the speed at which the student progresses, or based on more superficial information, such as the delay incurred by using a less efficient module.
  • Because learning experience engine 110 is capable of concurrently receiving input from more than one module, new modules may be added to the learning management platform 100 and gradually integrated into the system. Each module may increase in importance and influence based on its merit as determined over time by learning experience engine 110.
  • a new module may be configured by an administrator of the learning management platform 100 to be the primary module used for a particular subject, skill, or task. For example, an independently tested and proven module may be integrated into learning management platform 100 and immediately promoted as the module with the most merit, overriding any determination made by learning experience engine 110 . Learning experience engine 110 may then be configured to perform additional merit determinations for the new module, as well as existing modules.
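  • A minimal sketch of merit-weighted selection among competing modules, including an administrator override, follows; the module names and merit scores are hypothetical.

      # Hypothetical sketch: weight recommendations from competing learning system
      # modules by a merit score that an administrator may override.
      module_merit = {"ai_module": 0.6, "taxonomy_module": 0.3, "rule_module": 0.1}

      def pick_recommendation(recommendations, merit, override=None):
          """recommendations maps module name -> recommended learning object."""
          if override and override in recommendations:
              return recommendations[override]     # administrator-promoted module wins
          best_module = max(recommendations, key=lambda m: merit.get(m, 0.0))
          return recommendations[best_module]

      recs = {"ai_module": "LO_chess_openings", "rule_module": "LO_chess_rules"}
      print(pick_recommendation(recs, module_merit))                          # -> LO_chess_openings
      print(pick_recommendation(recs, module_merit, override="rule_module"))  # -> LO_chess_rules
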
  • learning experience engine 110 may provide more than one recommendation for the student, along with data about each recommendation such as data associated with content or learning objects. The student then may be able to choose a learning experience based on preference. Learning experience engine 110 may gather preference information for further processing by data analysis engine 150 , and use the analysis to assist learning experience engine 110 in determining future recommendations for that user or cognitively similar users.
  • learning space platform 120 may make the final decision regarding which content, assessments, or learning objects are presented to the user. This method allows processing of the final learning recommendation to be offloaded to local logic when overriding transitory environmental factors are present, such as location, mobile network signal strength, or lighting detected by the device.
  • learning experience engine 110 may provide a group of learning objects and associated content, assessments, and tools, along with logic to allow learning space platform 120 to determine, based on assessment performance, which of the group of learning objects should be presented next. This method allows for extended offline learning.
  • learning space platform 120 may still detect connectivity and request an additional group of learning objects based on assessment performance, advancing the state of the locally stored data in order to keep a cache of offline learning information, content, and tools available at all times.
  • a control system is included in learning management platform 100 .
  • the control system may be configured to select the next steps of the learning space navigation automatically or enable the navigation to include explicit steps set up by a faculty member, or a hybrid operation where a faculty member may decide to selectively override some of the steps of the skills hierarchy traversal.
  • learning management platform 100 includes a personalized assessment system that is capable of taking different goal sets from students, faculty, institutions, and employers, and selecting the right set of assessments to ensure that the student has mastered the right skills in-line with the goals.
  • profiles are maintained for students by personal cognitive DNA manager 170 and knowledge base 160 . These profiles may include PDNA, and describe one or more education-related attributes associated with students.
  • a profile snapshot (PS1T1) that represents the state of the profile of a first user at a particular point in time (T1) is also maintained.
  • a current profile (PS1T2) that represents a state of the profile of the first user at a second point in time (T2) is also maintained for the first user.
  • Another current profile (PS2T2) that represents a state of the profile of a second user at the second point in time (T2) is also maintained.
  • learning management platform 100 determines, based at least in part on an attribute of the profile snapshot PS1T1 of the first user that is not an attribute of the current profile PS1T2 of the first user, that a similarity exists between the current profile of the second user and the profile snapshot of the first user. Based on the similarity between the out-of-date profile (PS1T1) of the first user and the current profile PS2T2 of the second user, a content item is selected for delivery to the second user.
  • a first user may currently learn most effectively by using video-related learning tools, but at time T1 the first user may have learned most effectively by reading text.
  • When the second user, who currently learns most effectively by reading text, needs a learning tool, the current profile PS2T2 of the second user can be matched to the snapshot of the first user's profile PS1T1 (that was taken at time T1) to determine a learning recommendation for the second user. For example, if at time T1 the first user learned a concept well by reading text X, then the learning recommendation may be for the second user to read text X (even though that would not be the learning recommendation that would now be given to the first user).
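  • A hedged sketch of matching a current profile against stored profile snapshots follows; the snapshot fields and values are illustrative.

      # Hypothetical sketch: match a second user's current profile against stored
      # snapshots of a first user's profile and reuse what worked at that time.
      snapshots = [
          {"user": "first", "time": "T1", "learning_style": "text",
           "worked_well": "read text X"},
          {"user": "first", "time": "T2", "learning_style": "video",
           "worked_well": "watch video Y"},
      ]

      def recommend_from_snapshots(current_profile, snapshots):
          # Return the activity that worked for a snapshot with a matching style.
          for snap in snapshots:
              if snap["learning_style"] == current_profile["learning_style"]:
                  return snap["worked_well"]
          return None

      second_user_now = {"user": "second", "learning_style": "text"}
      print(recommend_from_snapshots(second_user_now, snapshots))   # -> 'read text X'
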
  • learning objects are maintained by skills hierarchy manager 180 .
  • Profiles are maintained for students by personal cognitive DNA manager 170 and knowledge base 160 .
  • Each profile describes one or more transient attributes that change simultaneously with environmental or emotional circumstances associated with the corresponding user. For example, a student may feel tired or sad, and indicate this through a user interface provided by learning space platform 120 .
  • This information may be reported to learning experience engine 110 , which may store the information in knowledge base 160 .
  • learning space platform 120 may be executing on a mobile device with GPS (Global Positioning System) capabilities, and may report location information to learning experience engine 110 , which may store the information in knowledge base 160 .
  • a profile may have one or more persistent attributes that describe characteristics of the student that are pertinent to educational activities, such as learning style. For example, a particular student may not learn effectively when exclusively using audio content. Although two students may have the same value for a particular transient attribute, they may have different values for a particular persistent attribute. For example, both students may be on a train (a transient attribute), but one student is unable to learn effectively using the type of content available for use while in transit (a persistent attribute) while the other student is able to learn effectively using the type of content available for use while in transit. Under these circumstances, learning experience engine 110 may decide to recommend one learning object to the student who is better able to learn on the train. The other student may receive a recommendation to wait until off the train to continue learning activities.
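  • The following sketch illustrates, under assumed attribute names, how a transient attribute and a persistent attribute might jointly determine whether to recommend a learning object now or advise waiting.

      # Hypothetical sketch: combine a transient attribute (riding a train) with a
      # persistent attribute (content types the student learns well from) to decide
      # whether to recommend a learning object now or suggest waiting.
      def recommend_now(transient, persistent, available_content_types):
          if transient.get("context") == "train":
              usable = set(available_content_types) & set(persistent["effective_content"])
              if not usable:
                  return "wait until off the train"
              return "recommend object using " + sorted(usable)[0]
          return "recommend object normally"

      on_train = {"context": "train"}
      student_a = {"effective_content": ["audio", "text"]}
      student_b = {"effective_content": ["interactive_whiteboard"]}
      print(recommend_now(on_train, student_a, ["audio"]))   # audio works in transit
      print(recommend_now(on_train, student_b, ["audio"]))   # wait until off the train
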
  • profiles are maintained for students by personal cognitive DNA manager 170 and knowledge base 160 .
  • a student sends a request for interaction with other users. For example, a particular student may want to discuss a particular educational concept with other students.
  • Learning experience engine 110 detects a group of students that are interacting with the same learning object or similar learning objects to the learning object that the particular student is interacting with.
  • Learning experience engine 110 compares the profile of the particular student with the profiles of the students in the group that are interacting with similar learning objects, and determines which of those students are similar to the particular student. At least one other student is selected to interact with the particular student based on this comparison. For example, a second student may be invited to a virtual whiteboard session or live chat session with the particular student.
  • a hierarchy of learning objects is maintained by skills hierarchy manager 180 .
  • Each learning object in the hierarchy is associated with a corresponding skill and content items that help students to master the skill.
  • a particular node is occupied by two learning objects that are competing with one another to be the preferred learning object at that node.
  • Both learning objects are associated with the same skill.
  • the learning objects may be associated with different content or logic that defines different content preferences that cause different content to be delivered to different students, even though the circumstances of each student may be the same.
  • two learning objects residing at the same node in a skills hierarchy may be designed to teach the skills required to perform integration by parts, a concept in calculus.
  • the first learning object may employ interactive learning games to teach the concept, while the second learning object may use a series of videos to teach the concept.
  • the first learning object may be recommended to a first student, while the second learning object may be recommended to a second student, even though the first and second student are cognitively similar with respect to personal attributes associated with mathematics.
  • a hierarchy of learning objects is maintained by skills hierarchy manager 180 .
  • Each learning object in the hierarchy is associated with a corresponding skill and one or more corresponding assessment items.
  • the assessment items measure the level of success that users attain with respect to the corresponding learning object.
  • a second learning object that is associated with a second skill is selected by learning experience engine 110 for recommendation to the first user.
  • learning experience engine 110 selects a third learning object to recommend to the second user, wherein the third learning object is associated with a third skill.
  • a first student may have performed poorly on an assessment for a learning object designed to teach the user skills related to graphing polynomial functions.
  • learning experience engine may recommend a second learning object to the first student.
  • the skill associated with the second learning object may be related to graphing functions generally.
  • the first student may also perform poorly on an assessment for the second learning object. This may be an indication that the second learning object is not a good remediator for the first learning object, given the level of performance demonstrated by the first student. Therefore, when a second student achieves the same level of performance on the assessment for the first learning object, the second student receives a recommendation for a third learning object, such as one that focuses on more general graphing skills.
  • the techniques described herein are implemented by one or more special-purpose computing devices.
  • the special-purpose computing devices may be hard-wired to perform the techniques, or may include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination.
  • Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques.
  • the special-purpose computing devices may be desktop computer systems, portable computer systems, handheld devices, networking devices or any other device that incorporates hard-wired and/or program logic to implement the techniques.
  • FIG. 5 is a block diagram that illustrates a computer system 500 upon which an embodiment of the invention may be implemented.
  • Computer system 500 includes a bus 502 or other communication mechanism for communicating information, and a hardware processor 504 coupled with bus 502 for processing information.
  • Hardware processor 504 may be, for example, a general purpose microprocessor.
  • Computer system 500 also includes a main memory 506 , such as a random access memory (RAM) or other dynamic storage device, coupled to bus 502 for storing information and instructions to be executed by processor 504 .
  • Main memory 506 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 504 .
  • Such instructions when stored in non-transitory storage media accessible to processor 504 , render computer system 500 into a special-purpose machine that is customized to perform the operations specified in the instructions.
  • Computer system 500 further includes a read only memory (ROM) 508 or other static storage device coupled to bus 502 for storing static information and instructions for processor 504 .
  • a storage device 510 such as a magnetic disk or optical disk, is provided and coupled to bus 502 for storing information and instructions.
  • Computer system 500 may be coupled via bus 502 to a display 512 , such as a cathode ray tube (CRT), for displaying information to a computer user.
  • An input device 514 is coupled to bus 502 for communicating information and command selections to processor 504 .
  • Another type of user input device is cursor control 516, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 504 and for controlling cursor movement on display 512.
  • This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane.
  • Computer system 500 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 500 to be a special-purpose machine. According to one embodiment, the techniques herein are performed by computer system 500 in response to processor 504 executing one or more sequences of one or more instructions contained in main memory 506 . Such instructions may be read into main memory 506 from another storage medium, such as storage device 510 . Execution of the sequences of instructions contained in main memory 506 causes processor 504 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
  • Non-volatile media includes, for example, optical or magnetic disks, such as storage device 510 .
  • Volatile media includes dynamic memory, such as main memory 506 .
  • Common forms of storage media include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, any other memory chip or cartridge.
  • Storage media is distinct from but may be used in conjunction with transmission media.
  • Transmission media participates in transferring information between storage media.
  • transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 502 .
  • transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
  • Various forms of media may be involved in carrying one or more sequences of one or more instructions to processor 504 for execution.
  • the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer.
  • the remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem.
  • a modem local to computer system 500 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal.
  • An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 502 .
  • Bus 502 carries the data to main memory 506 , from which processor 504 retrieves and executes the instructions.
  • the instructions received by main memory 506 may optionally be stored on storage device 510 either before or after execution by processor 504 .
  • Computer system 500 also includes a communication interface 518 coupled to bus 502 .
  • Communication interface 518 provides a two-way data communication coupling to a network link 520 that is connected to a local network 522 .
  • communication interface 518 may be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line.
  • communication interface 518 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN.
  • Wireless links may also be implemented.
  • communication interface 518 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
  • Network link 520 typically provides data communication through one or more networks to other data devices.
  • network link 520 may provide a connection through local network 522 to a host computer 524 or to data equipment operated by an Internet Service Provider (ISP) 526 .
  • ISP 526 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the “Internet” 528 .
  • Internet 528 uses electrical, electromagnetic or optical signals that carry digital data streams.
  • the signals through the various networks and the signals on network link 520 and through communication interface 518 which carry the digital data to and from computer system 500 , are example forms of transmission media.
  • Computer system 500 can send messages and receive data, including program code, through the network(s), network link 520 and communication interface 518 .
  • a server 530 might transmit a requested code for an application program through Internet 528 , ISP 526 , local network 522 and communication interface 518 .
  • the received code may be executed by processor 504 as it is received, and/or stored in storage device 510 , or other non-volatile storage for later execution.

Abstract

A method and apparatus for facilitating interaction in a learning environment is provided. Based on the subject matter that a student is working on, another student is selected for interaction with the student.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS; BENEFIT CLAIM
  • This application claims the benefit of both Provisional Appln. 61/295,635, filed Jan. 15, 2010 and Provisional Appln. 61/334,158, filed May 12, 2010, the entire contents of which are hereby incorporated by reference as if fully set forth herein, under 35 U.S.C. §119(e). This application is related to the following applications: (1) Appln. 12/______, entitled “DYNAMICALLY RECOMMENDING LEARNING CONTENT,” Attorney Docket Number 60201-0043, filed on even date herewith, the entire contents of which is hereby incorporated by reference as if fully set forth herein; and (2) Appln. 12/______, entitled “RECOMMENDING COMPETITIVE LEARNING OBJECTS,” Attorney Docket Number 60201-0053, filed on even date herewith, the entire contents of which is hereby incorporated by reference as if fully set forth herein. The applicants hereby rescind any disclaimer of claim scope in the related applications.
  • FIELD OF THE INVENTION
  • The present invention relates to learning management systems. In particular, the present invention relates to platforms for individualized learning.
  • BACKGROUND
  • Intelligent learning systems are systems that attempt to assist students in achieving specific learning goals. To date, these systems have mainly used a computerized teaching approach that mirrors the approach taken in brick-and-mortar classrooms. Each student is presented with the same lecture, content, and assessment, regardless of learning style, intelligence, or cognitive characteristics.
  • Advances in intelligent learning systems have been limited to approaches such as “adaptive learning.” These approaches are usually applied to logic-based topics such as mathematics, where the content that is served to each student is based on a pre-determined course-specific decision tree that is hard-coded into the system. If a first student and a second student each fail the same assessment by missing the same questions, both students will be presented with the same remedial materials as dictated by the decision tree.
  • Online courses are examples of “containers” that may employ adaptive learning technology to achieve a specific goal. For any given container, the adaptive learning technology used by the container is largely self-contained. That is, the adaptive learning technology employed by a container is programmed for a singular unchanging goal associated with the container.
  • For example, an adaptive learning tool may be designed to teach a student a course on the fundamentals of calculus. The designer of the tool will assume that the student possesses the foundational knowledge of mathematics required to begin the course, but the tool may provide a certain amount of "review" information as a means of calibration. In addition, the tool will not take into consideration the goals of any other course in which the student may be engaged. Instead, the tool will be designed to help the student achieve a particular level of proficiency in calculus. Once that level of proficiency is obtained by the student, the tool becomes useless. While data, such as assessment scores, may be saved, the core logic of the adaptive learning tool provides no additional benefit to the student unless the student decides to re-take the course or a portion of the course.
  • The illusion of adaptivity in “adaptive learning” tools is achieved by providing a dynamic experience for the student. This experience is based on the relationship between the assessment scores of the student and the pre-programmed hierarchy included in the tool. However, existing tools do not actually “adapt” to the student. Instead, by performing in a particular way, the student merely traverses down a pre-existing path through the tool's hierarchy.
  • The approaches described in this section are approaches that could be pursued, but not necessarily approaches that have been previously conceived or pursued. Therefore, unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualify as prior art merely by virtue of their inclusion in this section.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:
  • FIG. 1 is a block diagram illustrating a learning management platform on which an embodiment may be implemented.
  • FIG. 2 is a block diagram illustrating a skills hierarchy structure that may be used in an embodiment.
  • FIG. 3 is a block diagram illustrating a learning object in an embodiment.
  • FIG. 4 illustrates a content feedback interface in an embodiment.
  • FIG. 5 illustrates a computer system upon which an embodiment may be implemented.
  • DETAILED DESCRIPTION
  • In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the present invention.
  • General Overview
  • It is common for students to take courses that provide too much information that the student already knows. On the other hand, it is also common for a student to complete a course in a sequence, only to be left behind in the next course in the sequence because the next course assumes knowledge that was not sufficiently covered in the earlier course. Students may also be required to take certain courses to complete a degree program, even if only a small portion of the skills learned in those courses pertain to the student's degree program. One way to ensure that students get the most out of their educational experiences is to personalize the educational experiences.
  • By using a learning management platform that employs the techniques described hereafter, students can receive personalized learning recommendations that identify specific content or learning objects that pertain to each individual student. In an embodiment, each learning object is associated with an individual skill and content associated with that skill, and a single course (e.g., Math 101) comprises many learning objects. Learning objects are organized in a hierarchy that is based on the skills associated with the learning objects. Learning objects can be made to compete with one another for a spot in the hierarchy so that the "best" learning object can be recommended more often.
  • In an embodiment, a learning management platform generates learning recommendations for students. The learning management platform implements multiple learning models and instructional strategies to guide a student throughout that student's academic journey in a way consistent with the student's cognitive characteristics and other attributes relevant to learning. The platform treats a student's journey as a life-long continuum and provides a set of powerful capabilities to serve the information necessary to support the student over time, even as the student's motivations and goals change. The learning management platform provides the “right outcome” for each student for a broad-based curriculum, and ultimately for each student's life-long learning goals.
  • If a student already possesses a particular skill, that student may proceed to learning a more advanced skill in the hierarchy, in an embodiment. If a student attempts to complete a learning object and does not succeed, the recommendation provided for that student includes a remedial learning object to help prepare the student for another attempt at the first learning object. As different students attempt remediation in this way, the learning management platform learns which objects are good remediators for other objects.
  • In an embodiment, the student interacts with the learning management platform through a “learning space” platform, which may be a web-based application or an application being executed on one or more devices or computers associated with the student. The learning space platform defines the student's experience, and provides feedback to the learning management platform. The learning space platform makes a request to a learning experience engine. The request includes student profile information. Based on information associated with the student profile information, the learning experience engine provides an individualized learning recommendation. For example, an English major may receive a recommendation to learn from a particular learning object associated with the skill “using prepositions properly”, using particular content associated with that object, such as a video or audio lecture associated with the skill.
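  • A minimal, hypothetical sketch of this request/response exchange follows; the field names and the stub engine class are assumptions for illustration and do not represent the actual interface of learning experience engine 110.

      # Hypothetical sketch of the request/response exchange between a learning
      # space platform and a learning experience engine; names are illustrative.
      def request_learning_recommendation(engine, student_profile):
          """Send profile information and receive an individualized recommendation."""
          request = {
              "student_id": student_profile["id"],
              "profile": student_profile,           # includes PDNA and preferences
          }
          return engine.recommend(request)

      class ToyLearningExperienceEngine:
          def recommend(self, request):
              # A real engine would consult the knowledge base, PDNA, and skills
              # hierarchy; this stub simply returns a canned recommendation.
              return {
                  "learning_object": "using prepositions properly",
                  "content": ["video lecture", "audio lecture"],
              }

      profile = {"id": "student-1", "major": "English", "learning_style": "visual"}
      print(request_learning_recommendation(ToyLearningExperienceEngine(), profile))
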
  • In an embodiment, user profiles are compared with one another to identify students that are similar to one another. These similarities may help the learning management platform decide the best recommendation for a student. For example, if a student is similar to another student that successfully learned a particular skill using a particular learning object, then the particular learning object may be recommended. In addition, user profiles may be used to determine which users should interact with one another in online collaborative learning sessions.
  • Recommendations may also be based on the learning context of the student. In an embodiment, environmental and/or emotional circumstances associated with the student may alter the learning recommendation. For example, a learning object that requires a student to draw pictures may not be appropriate for a student using a mobile device with a small screen while riding a train.
  • In another embodiment, the individualized learning recommendation is based at least in part on a learning skills hierarchy. In an embodiment, the learning skills hierarchy is a hierarchical multidimensional directed graph that has, as nodes, learning objects associated with skills, assessments, and content.
  • Individualized Learning Recommendations
  • As used herein, the term “learning recommendation” refers to information, provided to a student or a device associated with a student, which provides educational direction. For example, a user that completes the course “Math 101” may be advised by a learning recommendation to begin the course “Math 102.” Learning recommendations need not be limited to course recommendations, however. Instead, skills, content, tools, and activities may be recommended to a student with the goal of furthering the education of the student.
  • An individualized learning recommendation is a learning recommendation that is based on individual attributes of the student. For example, an individualized learning recommendation may take into account the type of learner a student classifies himself as. A student may receive recommendations to watch educational videos if the student identifies himself as a “visual” learner. However, if the same student performs poorly on assessments after studying with video content, but performs well on assessments after reading materials, then the learning platform may determine that the student's belief that he is a visual learner may be incorrect. In this case, future individualized learning recommendations may not include video content.
  • Individualized learning recommendations may be based on many types of information related to the student, such as past performance, interests, major, and various demographic information. These attributes may be compared with the attributes of other students that have had similar educational needs, and an individualized learning recommendation may be based on the success of similar students. For example, a first student may be required to learn integration as part of a course in Calculus. Other students with attributes similar to those of the first student, who have successfully learned integration, may provide insight into which content, tools, and activities will help the first student be successful in learning integration.
  • Structural Overview
  • FIG. 1 is a block diagram illustrating a learning management platform 100, according to an embodiment of the invention. Learning management platform 100 generally represents a set of one or more computer programs, and associated resources, configured to manage educational data and information about students, provide learning recommendations to students, and use information gathered from analyzing student interaction with the system to increase the effectiveness of future learning recommendations. Learning management platform 100 facilitates the delivery of information based on learning theories, models, and strategies. Learning management platform 100 includes logic that facilitates communication between its various components.
  • In the illustrated embodiment, learning management platform 100 includes a learning experience engine 110, a learning space platform 120, a learning content manager 130, a learning tools manager 140, a data analysis engine 150, a knowledge base 160, a personal cognitive DNA manager 170, a skills hierarchy manager 180, and learning system modules 190A-C. Each of these components of platform 100 shall be described in greater detail hereafter.
  • The Learning Experience Engine
  • Learning experience engine 110 generally represents a decision-making engine that interacts with all other components of learning management platform 100 and uses information gathered from these components to provide the best learning recommendation possible to students that interact with learning management platform 100. According to one embodiment, learning experience engine 110 includes learning recommendation logic configured to provide individualized learning recommendations based on information gathered from other elements of the learning management platform 100, such as knowledge base 160 and personal cognitive DNA manager 170.
  • Unlike conventional “adaptive learning” systems, learning experience engine 110 makes learning recommendations that are not based merely on traversal of a predetermined path that is based only on the student's degree program or class. Rather, learning experience engine 110 takes into consideration transient and non-transient profile attributes of each student. A transient profile attribute is any attribute that changes with relatively high frequency. A transient attribute may, for example, change simultaneously with environmental, physical or emotional circumstances associated with the corresponding user. Thus, the current location of a student would typically constitute a transient profile attribute. In contrast, a non-transient profile attribute is any attribute that changes rarely, if ever. Non-transient profile attributes include, for example, the birth date or home address of a student.
  • As an example of how learning experience engine 110 may make use of profile attributes, consider a situation in which a student may want to learn a skill, such as how to use gerunds in a sentence, while traveling on an airplane. Learning experience engine 110 may ask for the expected arrival time to determine how much time the student has left on the plane. Then, taking into account how much time the student has left on the plane, as well as attributes such as that student's learning style and habits and the type of device that the student is using, learning experience engine 110 selects appropriate content, such as audio/video content and text content, for delivery to the student's device.
  • As the data that drives the decision-making process of the learning experience engine 110 evolves, the decisions made by learning experience engine 110 become more accurate. Students may receive hundreds of learning recommendations over time. As students perform activities and take assessments associated with recommendations, data associated with each student is updated to reflect the types of activities that work well for each student, the strengths and weaknesses of the student, and other useful education-related attributes of the students.
  • As used herein, the term “education-related attributes” refers to any attributes that relate to a student's learning history, goals or abilities. As shall be described in greater detail below, education-related attributes may include non-transient attributes, such as a student's prior classes and grades, and transient attributes such as a student's current mood.
  • As more data is collected, patterns emerge, and learning experience engine 110 can provide individualized learning recommendations with a high degree of confidence in the expected success of each student. For example, it may become clear that a particular student performs poorly when he tries to learn skills using only audio content, even though that student has expressed a preference for audio content. In this case, learning experience engine 110 may subsequently require content other than audio content to be delivered to the user, instead of or in addition to audio content.
  • Other types of data that may be used in determining individualized learning recommendations are discussed herein. Rules defining the use of this data may be used to configure the learning experience engine 110. In addition, weights, confidence metrics, and other measurements of data importance and accuracy may be assigned to any type of data discussed herein, and these measurements may be taken into consideration in the rules that define the use of the data by learning experience engine 110.
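  • By way of illustration only, the following minimal Python sketch shows one way that weighted rules with confidence metrics might be combined to rank candidate content; the rule names, weights, and confidence values (e.g. matches_style, fits_time) are hypothetical and are not part of any particular embodiment.

      # Minimal sketch (hypothetical names): weighted rules scoring candidate
      # content items for a recommendation.

      def matches_style(student, item):
          # Reward content whose modality matches the student's preferred style.
          return 1.0 if item["modality"] == student["preferred_style"] else 0.0

      def fits_time(student, item):
          # Reward content that fits in the student's available time window.
          return 1.0 if item["minutes"] <= student["minutes_available"] else 0.0

      # Each rule carries a weight and a confidence in the data it relies on.
      RULES = [
          {"score": matches_style, "weight": 0.6, "confidence": 0.9},
          {"score": fits_time,     "weight": 0.4, "confidence": 0.7},
      ]

      def rank_content(student, candidates):
          def total(item):
              return sum(r["weight"] * r["confidence"] * r["score"](student, item)
                         for r in RULES)
          return sorted(candidates, key=total, reverse=True)

      student = {"preferred_style": "audio", "minutes_available": 20}
      candidates = [
          {"id": "gerunds-video", "modality": "video", "minutes": 35},
          {"id": "gerunds-audio", "modality": "audio", "minutes": 15},
      ]
      print(rank_content(student, candidates)[0]["id"])  # gerunds-audio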
  • In one embodiment, learning experience engine 110 communicates and shares information with other elements of the learning management platform 100. For example, data analysis engine 150 may not be directly communicatively coupled to skills hierarchy manager 180 in an embodiment. In an embodiment where data analysis engine 150 and skills hierarchy manager 180 do not directly communicate with each other, communication between data analysis engine 150 and skills hierarchy manager 180 may nevertheless be carried out using learning experience engine 110 as an intermediary in the communication operation.
  • In other embodiments, other elements of the learning management platform 100 may be directly communicatively coupled to one another, and communication does not require the use of the learning experience engine 110 as an intermediary. For example, personal cognitive DNA manager 170 may directly communicate with knowledge base 160.
  • The Learning Space Platform
  • Learning space platform 120 represents the user interface that the student sees when interacting with learning management platform 100. Learning space platform 120 also includes logic that is specific to the device on which learning space platform 120 resides. Learning space platform 120 includes logic configured to interact with other elements of learning management platform 100. For example, learning space platform 120 may receive a learning recommendation from learning experience engine 110, and based on this learning recommendation, learning space platform 120 may request content from learning content manager 130 and tools from learning tools manager 140. While the tools themselves can be “plugged into” the Learning Space using various existing interoperability standards (such as IMS's Learning Tools Interoperability standard or the web's Open Social standard) and single sign-on techniques, learning space platform 120 provides the recommended experience by facilitating the delivery of learning content using appropriate tools, so that the student can complete the tasks needed to master his or her next outcome.
  • Learning space platform 120 resides on a client computing device, in an embodiment. A client computing device includes any device capable of presenting a user with learning information, such as a personal computer, mobile computing device, set-top box, or network based appliance. In other embodiments, learning space platform 120 resides on a terminal server, web server, or any other remote location that allows a user to interact with learning space platform 120. For example, learning space platform 120 may be a web-based interface included in learning experience engine 110.
  • Learning space platform 120 is used to make “local” decisions about the student experience, in an embodiment. For example, learning space platform 120 may be an iPhone application that detects the location of the student or asks for feedback from the student, such as feedback related to the student's mood. The location data and mood data may then be used to determine the best learning experience. Local decisions may also be based on the screen size or other attributes of the device on which the learning space platform 120 resides. For example, learning experience engine 110 may deliver a variety of learning content to the student. After the content has been received, learning space platform 120 decides which content to display, and how to display it, taking into consideration screen size, stability of Internet connection, or local preferences set by the student. In an embodiment, the decisions discussed above may be made by learning experience engine 110.
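  • A minimal Python sketch of such a “local” decision follows; the function pick_variant and its screen-size threshold are hypothetical and merely illustrate choosing among already-delivered content variants based on device attributes.

      # Minimal sketch (hypothetical names): a "local" decision made on the
      # client, choosing among content variants already delivered by the engine,
      # based on screen size and connection stability.

      def pick_variant(variants, screen_inches, connection_stable):
          # Prefer video on larger screens with a stable connection,
          # otherwise fall back to text.
          for v in variants:
              if v["type"] == "video" and screen_inches >= 7 and connection_stable:
                  return v
          return next(v for v in variants if v["type"] == "text")

      variants = [
          {"type": "video", "url": "http://example.invalid/gerunds.mp4"},
          {"type": "text",  "url": "http://example.invalid/gerunds.html"},
      ]
      print(pick_variant(variants, screen_inches=3.5, connection_stable=False)["type"])  # text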
  • In addition, learning space platform 120 may provide detailed user and time-specific transient data to learning experience engine 110, in an embodiment. For example, the current location of the user may be provided by learning space platform 120 to learning experience engine 110, which stores the data and uses it as input for learning recommendation decisions. As another example, the learning space platform 120 may communicate the current speed at which the user is moving, thereby allowing the learning experience engine 110 to make recommendations based on whether the user is stationary (e.g. at a desk), or travelling (e.g. in a car, bus or train). For example, learning experience engine 110 may refrain from sending tests to a user during periods in which the learning space platform 120 is providing information that indicates that the user is travelling.
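  • The following minimal sketch, with hypothetical field names (e.g. speed_mps, allow_assessment), illustrates one way transient data reported by the client could be used to withhold assessments while the user appears to be travelling.

      # Minimal sketch (hypothetical field names): transient data reported by a
      # client-side learning space platform, and a simple server-side check that
      # withholds assessments while the user appears to be travelling.

      transient_report = {
          "user_id": "student-42",
          "location": (37.39, -122.08),   # latitude, longitude
          "speed_mps": 18.5,              # metres per second, from the device
          "device": "iPhone",
          "reported_at": "2011-01-14T09:30:00Z",
      }

      def allow_assessment(report, max_stationary_speed_mps=1.5):
          # Treat anything faster than a slow walk as "travelling".
          return report["speed_mps"] <= max_stationary_speed_mps

      print(allow_assessment(transient_report))  # False: user is moving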
  • As additional examples, a particular type of mobile computing device may not possess the ability to install a particular tool, such as a flash plug-in. Further, the device may currently be low on battery power, making a learning mode that requires less screen use more desirable, or even the only option. The size of the computing device's screen may also be considered when making a content recommendation. For example, a collaborative tool, such as a chat session or shared whiteboard system, may require a larger screen to be effective, and thus may not be appropriate for a mobile computing environment.
  • The tool (e.g. a cognitive tutor) selected to deliver content to the user may or may not be aware of the student's cognitive DNA—it is the responsibility of the learning space platform 120 to launch the tool with the appropriate configurations for customization supported by the tool. Advanced tools or newly created tools on the platform may choose to use the information on the cognitive DNA in order to personalize the experience.
  • The Learning Content Manager
  • Learning content manager 130 stores and manages learning content. Learning content includes any content that may be used to learn a skill. Examples of learning content include text, audio files such as mp3 files, video files such as QuickTime files, interactive flash movies, or any other type of multimedia content.
  • In an embodiment, learning content manager 130 includes a content repository and a content categorization system for storing and organizing learning content. The content repository stores content in non-volatile memory, such as a shared hard disk system. The content categorization system provides content indexing services, along with an interface for creating and associating metadata with content stored in the content repository.
  • Content may be associated with metadata that describes the content. This metadata assists course developers in determining which content may be appropriate for learning particular skills. For example, metadata associated with a video may include a title attribute that includes the text “how to factor polynomials.” Other attributes may include a general category, such as “math,” and a content type, such as “QuickTime video.” Metadata may be embedded within the content being described by the metadata, may be stored in a separate metadata file, such as an XML file, that is associated with the content being described, or may be stored in a database with an association to the content being described.
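  • A minimal sketch of a content metadata record, with hypothetical attribute names, is shown below; it merely illustrates metadata stored separately from the content and linked to it by an identifier.

      # Minimal sketch (hypothetical attribute names): metadata describing a
      # content item, stored alongside a reference to the content it describes.

      content_metadata = {
          "content_id": "vid-001",
          "title": "how to factor polynomials",
          "category": "math",
          "content_type": "QuickTime video",
          "skills": ["factor second-degree polynomials"],
          "location": "repository://videos/vid-001.mov",
      }

      # A course developer tool could filter records by category or skill.
      math_items = [m for m in [content_metadata] if m["category"] == "math"]
      print(len(math_items))  # 1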
  • Learning content manager 130 also includes content delivery logic configured to manage requests for content that is stored in the content repository. For example, some content may be streamed in order to preserve bandwidth. In some cases, it makes sense to deliver all required content for a particular course at the same time, such as when the student expects to be without Internet access for an extended period of time. Thus, learning content manager 130 may be directed by learning experience engine 110 to deliver content in a particular way, depending on attributes of the student. In addition, certain content formats may not be supported by certain devices. For example, content delivery logic may choose or even change the format of the content being delivered if the device requesting the content does not support a particular format, such as the flash format.
  • The types of content and tools that may be used with learning management platform 100 are not limited to those discussed herein. Instead, the examples provided are meant to serve as possible types of content and tools that may be used, and are non-limiting examples.
  • The Learning Tools Manager
  • Learning tools represent software required for delivery of learning content. Learning tools may include, for example, video players, virtual whiteboard systems, video chat software, and web browsers. A web browser plug-in may also be a learning tool. Each of these tools may be required in order for the student to view the recommended content. For example, a recommended piece of content may consist of a flash movie. A flash movie, in order to be played, requires a flash player to be invoked by the learning space platform 120 running on the student's client computing system. Another example of a learning tool may be a game system.
  • Learning tools manager 140 manages and organizes learning tools. In an embodiment, learning tools manager 140 includes a tool information database, a tool repository, tool selection logic, and tool delivery logic.
  • The tool information database includes information about each learning tool, such as whether or not the tool will work with a particular type of client, such as a handheld device. For example, a flash player may not work on some mobile devices. Other information in the tool information database may include information, such as network location information, that enables learning space platform 120 to invoke the download of a tool that is not stored in the tool repository. For example, a URL of a required tool, which may not be stored in the repository, may be provided to a student, along with a prompt to download the tool.
  • The tool repository provides storage for downloadable tools. In an embodiment, tool selection logic may assist learning space platform 120 in selecting a tool that is appropriate for a particular client device. For example, tool selection logic may determine that a particular media player, such as a video player or browser plugin, is required in order to view content that has been suggested to the student. Tool selection logic may determine that the student is using an Apple Macintosh computer, and provide the version of the tool that runs on Apple machines for download. In addition, tool selection logic may determine that no tool that plays the suggested media is available for the platform. Tool selection logic may then report this to learning experience engine 110, which will make a new content recommendation in an embodiment.
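  • The following minimal sketch, using hypothetical tool entries, illustrates tool selection keyed by content format and client platform; a missing entry corresponds to the case in which no suitable tool exists and a new content recommendation is needed.

      # Minimal sketch (hypothetical entries): a tool information database keyed
      # by content format and client platform; selection either returns a tool
      # record or None, in which case different content would be recommended.

      TOOL_INFO = {
          ("flash", "mac"):     {"name": "Flash Player", "url": "http://example.invalid/flash-mac"},
          ("flash", "windows"): {"name": "Flash Player", "url": "http://example.invalid/flash-win"},
          ("mp4",   "iphone"):  {"name": "built-in player", "url": None},
      }

      def select_tool(content_format, platform):
          return TOOL_INFO.get((content_format, platform))

      print(select_tool("flash", "iphone"))  # None: recommend different content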
  • Tool delivery logic is configured to manage requests for tools that are stored in the tool repository. For example, a student may require a tool that takes a significant time to download. Tool delivery logic may break up the tool into smaller parts for separate download in order to ensure successful delivery of the tool in the case of a lost connection. In addition, tool delivery logic may interact with a download manager in the learning space platform 120.
  • The Data Analysis Engine
  • Data analysis engine 150 performs a detailed analysis of all information gathered by other elements of the learning management platform 100 in order to identify correlations between student attributes and learning experiences. For example, changes in user profile information, assessment results, user behavior patterns, clickstream data, learning evolution information, resource monitoring information, or any other type of information available may be analyzed by data analysis engine 150. Data need not be structured in a particular way to be analyzed, and multiple sources of data may be analyzed in real time. In addition, multiple data sources may be aggregated, even if each source provides data in a different format or structure. The aggregated data may then be filtered to provide a detailed cross platform analysis on specific data relationships.
  • Data analysis engine 150 may analyze profile information to determine groups of users that are similar to one another. In addition, data analysis engine 150 may determine the times of day, locations, and other transient attributes that are associated with a high degree of success for a student. For example, based on results of assessments taken at different times of the day, data analysis engine 150 may determine that a particular student studies more effectively between 9 am and 11 am, and that the positive effect is magnified when the student studies at a particular bookstore in town. Any attribute may be studied by data analysis engine 150 to determine correlations between student attributes and learning effectiveness.
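  • A minimal illustrative sketch of one such correlation, using hypothetical assessment data, is shown below; it groups assessment scores by the hour at which they were taken and reports the most effective hour.

      # Minimal sketch (hypothetical data): correlating assessment scores with
      # the hour of day at which they were taken, to find a student's most
      # effective study window.

      from collections import defaultdict

      results = [
          {"hour": 9, "score": 0.92}, {"hour": 10, "score": 0.88},
          {"hour": 21, "score": 0.61}, {"hour": 22, "score": 0.58},
      ]

      by_hour = defaultdict(list)
      for r in results:
          by_hour[r["hour"]].append(r["score"])

      averages = {hour: sum(s) / len(s) for hour, s in by_hour.items()}
      best_hour = max(averages, key=averages.get)
      print(best_hour, round(averages[best_hour], 2))  # 9 0.92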
  • In an embodiment, data analysis engine 150 operates in a clustered computing environment, using existing software such as Apache Hadoop. In other embodiments, custom implementations of Hadoop or other software may be used, or a completely custom data analysis system may be used. Data analysis engine 150 includes reporting logic configured to provide detailed reports to learning experience engine 110. These reports assist learning experience engine 110 in making learning recommendations.
  • The Knowledge Base
  • Knowledge Base 160 manages persistent data and persistently stores snapshots of certain transient data. For example, student categorization information, student study group information, cognitive DNA relationship information, and persistent student profile information may all be stored in knowledge base 160. Although this data is persistently stored, the data may change as required by other elements of the learning management platform 100. For example, data analysis engine 150 may provide a report to learning experience engine 110 that causes learning experience engine 110 to indicate to knowledge base 160, based on the report, that student categorization information for a particular student should be changed. Knowledge base 160 will then alter the persistent data to reflect the indicated change. In an embodiment, knowledge base 160 includes a relational database management system to facilitate the storage and retrieval of data. Knowledge base 160 is communicatively coupled to learning experience engine 110, and provides learning experience engine 110 with student information to assist in creating an individualized learning recommendation.
  • The Personal Cognitive DNA Manager
  • Personal cognitive DNA manager 170 manages data associated with students. A collection of data associated with a student is known as personal cognitive DNA (PDNA). The portions of a PDNA that are stored at the personal cognitive DNA manager 170 may be transient data, while persistent portions of the PDNA may be stored in knowledge base 160. PDNA data stored in personal cognitive DNA manager 170 may also include references to persistent data stored in knowledge base 160. Personal cognitive DNA manager 170 may include a database management system, and may manage PDNA for all students. In an embodiment, instances of personal cognitive DNA manager 170 may reside on the client computing devices of students, and may be part of the learning space platform 120. In this embodiment, PDNA for users of the client computing device or the associated learning space platform 120 may be stored in volatile or non-volatile memory. A combination of these embodiments may also be used, where a portion of the personal cognitive DNA manager 170 resides on a client while another portion resides on one or more servers. Personal cognitive DNA manager 170 is communicatively coupled to learning experience engine 110, and provides learning experience engine 110 with transitory student information to assist in creating an individualized learning recommendation. For example, a user's location, local time, client device type, or client operating system may be provided to learning experience engine 110 to assist in determining what type of content is appropriate for the environment and device. In an embodiment, personal cognitive DNA manager 170 and knowledge base 160 may be combined.
  • Learning Objects
  • FIG. 3 is a block diagram illustrating a learning object in an embodiment. A learning object such as learning object 300 is associated with data that describes that learning object. The associated data may be stored in a learning object data structure. In an embodiment, a learning object is referenced by a learning object identifier, and associated data or references to the associated data may be stored in a relational database, and may reference the identifier to indicate that the data is associated with the learning object represented by the identifier.
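  • The following minimal sketch, with hypothetical field names, illustrates a learning object record that is keyed by an identifier and that references its associated content and assessments rather than storing them inline.

      # Minimal sketch (hypothetical fields): a learning object record keyed by
      # an identifier, with associated data stored as references.

      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class LearningObject:
          object_id: str
          skill: str                       # e.g. "add single-digit numbers"
          content_ids: List[str] = field(default_factory=list)
          assessment_ids: List[str] = field(default_factory=list)
          metadata: dict = field(default_factory=dict)

      lo = LearningObject(
          object_id="LO-212",
          skill="calculate a confidence interval",
          content_ids=["vid-001", "text-017"],
          assessment_ids=["quiz-88"],
      )
      print(lo.object_id, lo.skill)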
  • Each learning object includes a skill, such as skill 310. A skill represents an ability that a student is meant to acquire. For example, a skill may represent the ability to perform addition of single-digit numbers, form a complete sentence using a particular language, or type a certain number of words-per-minute. There is no limit to the complexity or simplicity of skills that may be included in a learning object.
  • Each learning object includes content, such as content 320. As shall be described in greater detail hereafter, the content of a learning object may include, by way of example and not by way of limitation, assessments, remediation data, skills hierarchy data, bloom level data, learning object metadata, and object-specific personalized data.
  • Content is said to be “included” as part of a learning object, even though the content may only be referenced by the learning object, but may not actually be stored within a learning object data structure. Content may be stored in a content repository and managed by learning content manager 130. In an embodiment, content is “tagged” with metadata describing the content, such as keywords, skills, associated learning objects, the types of learners (e.g. visual) that may benefit from the content, the type of content (e.g. video or text), and statistical information regarding the content usage. Learning space platform 120 and learning experience engine 110 may be authorized to add, remove, or alter tags associated with content via the learning content manager 130.
  • Skills Hierarchy
  • Recommendations generated by learning experience engine 110 are based in part on a skills hierarchy having learning objects in an embodiment. FIG. 2 is a block diagram illustrating a skills hierarchy structure that may be used in an embodiment. Skills hierarchy 200A includes learning objects 201-208, 211, and 214. Skills hierarchy 200B includes learning objects 207-213 and 215-217. A skills hierarchy may represent a group of skills, a portion of a course, a course, a field of study, a certificate program, a degree program, an individual competency map that represents the skills acquired by a student, or any other education related structure. Skills hierarchies may be mapped to a wide variety of various learning theories, content types, and modes.
  • Links between objects in the hierarchy represent the relationship between those objects. For example, a link between two learning objects may mean that the subject matter covered in one of the learning objects builds on the subject matter covered in the other learning object. A different link may indicate that one learning object is a prerequisite of the other.
  • Certain skills may be required in order to begin learning more advanced skills. According to skills hierarchy 200A for example, the skill associated with learning object 205 must be acquired before advancing to the skill associated with learning object 208. In this way, the learning skills hierarchy is hierarchical. However, a non-hierarchical approach may be used in an embodiment. For example, a non-hierarchical directed-graph approach may be used in an embodiment that is based on a different learning model.
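  • A minimal sketch of prerequisite links in a skills hierarchy follows; the learning object identifiers and the eligibility check are hypothetical and merely illustrate requiring certain skills before more advanced ones may be attempted.

      # Minimal sketch (hypothetical identifiers): prerequisite links in a skills
      # hierarchy, and a check for which learning objects a student may attempt
      # next given the objects already completed.

      PREREQUISITES = {
          "LO-208": {"LO-205"},
          "LO-212": {"LO-208", "LO-209"},
      }

      def attemptable(obj, completed):
          # An object is attemptable when all of its prerequisites are completed.
          return PREREQUISITES.get(obj, set()) <= set(completed)

      print(attemptable("LO-212", ["LO-205", "LO-208"]))             # False: LO-209 missing
      print(attemptable("LO-212", ["LO-205", "LO-208", "LO-209"]))   # True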
  • The Skills Hierarchy Manager
  • Skills hierarchy manager 180 manages skills hierarchy information that describes the relationships between skills. For example, a student may be required to learn how to add and subtract before he learns how to multiply and divide. A complete skills hierarchy may be made up of smaller interconnected skills hierarchies that represent smaller groups of skills; together, these may represent all of the skills required to complete a traditional educational course or degree program. The nodes in the hierarchy correspond to learning objects. Since a single node may be considered a prerequisite for many other nodes, and many nodes may be prerequisites for a single node, the hierarchy may be multidimensional.
  • Skills hierarchy manager 180 manages the relationships between learning objects. For example, a relational database may be used to keep track of the node dependency information. Skills hierarchy manager 180 also stores object-specific data that describes skills, remediation information, assessment information, skills hierarchy association information, learning theory information, content information, tools information, and other metadata associated with learning objects. Skills hierarchy manager 180 interfaces with learning experience engine 110 and provides learning experience engine 110 with skills hierarchy data to assist learning experience engine 110 in creating an individualized learning recommendation. The skills hierarchy information provided to learning experience engine 110 may include whole or partial skills hierarchies, and object-specific data may be filtered according to parameters passed to the skills hierarchy manager 180 in a request from learning experience engine 110.
  • Learning Theories, Models, and Strategies
  • Learning theories address how people learn by providing a paradigm through which to view a learning objective. Although theories are abundant, three well-understood theories are behaviorism, constructivism, and cognitivism. Behaviorism is a view based on the assumption that people learn through stimulation: positive or negative reinforcement shapes the behavior of the student. Constructivism is a view that assumes that the student learns through experience, and that the learner actively constructs knowledge. Cognitivism is a view that assumes that people process information, and do not merely react to stimulation.
  • Learning models are created based on learning theories, and are meant to operationalize one or more learning theories. Different instructional strategies may be used, depending on the model to be implemented. Techniques include lecturing, case study, collaboration, one-on-one, direct instruction, and indirect instruction. Content and tools that facilitate the delivery of content can be used to implement instructional strategies. Examples of content include text, video, audio, and games. Examples of tools include video players, browser plug-ins, e-book readers, shared whiteboard systems, and chat systems. Many other examples of content and tools exist.
  • The theories and models together have been created to classify, adapt, and guide the process of educating a student, taking into consideration the characteristics of the student that are pertinent for learning. To date, learning models have been implemented in application “silos,” where each application implemented a specific model (e.g. intelligent tutors), catering to very specific topics that suited the model's flat-hierarchy-based algorithms, typically logical subjects such as mathematics. These types of applications have limited use in broad-based education, such as a 4-year degree program or interdisciplinary subjects such as Economics and Marketing.
  • The Learning Modules
  • Learning modules can be optionally integrated into learning management platform 100, in an embodiment. Learning modules generally represent third-party applications or stand-alone applications with decision-making capabilities. In an embodiment, learning experience engine 110 may request input from a learning module such as learning module 190A, and then take the input into consideration in making the overall learning recommendation decision. For example, learning module 190A may recommend that a first content item be delivered to a student. Learning experience engine 110 may then determine that the first content item should not be provided to the student for reasons that learning module 190A is unable to consider, so learning experience engine 110 may recommend a similar content item, such as an item in a text format rather than an audio format.
  • Learning system modules 190A, 190B, and 190C provide additional functionality to learning management platform 100, and may be based on models other than those described herein. Application Programming Interface (API) 115 provides a programming interface for learning system modules such as learning system modules 190A, 190B, and 190C. Learning system modules 190A, 190B, and 190C are communicatively coupled to learning experience engine 110 via API 115, and also may be communicatively coupled to other elements of learning management platform 100, such as learning content manager 130. Learning system modules 190A, 190B, and 190C may include third party or custom systems, hardware, or software modules.
  • Personal Cognitive DNA
  • In an embodiment, learning models and theories may be operationalized using each student's PDNA to create an individualized learning experience for students. Although PDNA is referred to herein as “personal cognitive DNA,” this label does not indicate that PDNA data collection and usage is limited to embodiments that are based upon cognitive models. PDNA data includes information about a student's cognitive strengths and weaknesses (as well as preferences) that are provided explicitly by the student or inferred by the system as the student interacts with the system and the outcomes are measured.
  • PDNA may be used in any embodiment, independent of any particular learning model. PDNA data is a collection of data associated with a student. Transient profile data may be stored in the personal cognitive DNA manager 170, while persistent profile data may be stored in knowledge base 160. PDNA data stored in personal cognitive DNA manager 170 may include references to persistent data directly or indirectly associated with the student that is stored in knowledge base 160.
  • In order to provide the right experiences to the student, in addition to the proper tools and models, the system must collect and maintain a dynamically updating rich data layer to support predictive education models. A rich data layer generally refers to information that is gathered and linked to create intelligence that may be used to inform learning experience engine 110, which uses this information to generate learning recommendations. In one embodiment, the rich data layer is dynamically updating in that the data being collected changes over time, and data that does not conform to the changes becomes incorrect. For example, as a student achieves a high degree of proficiency with a particular skill, data that suggests that the student needs to become proficient with the skill becomes outdated and incorrect. Thus, the dynamic data layer must keep up with the current information available for each student.
  • Each student using learning management platform 100 is associated with PDNA for that student. When a student is new to the system, the PDNA for that student may contain minimal information, such as demographic information, a student's declared major, self-proclaimed learning style preferences, and imported transcript data such as grades and coursework done at other institutions. However, as the student begins using the learning management platform 100, vast amounts of data may be collected and analyzed by data analysis engine 150, resulting in new PDNA information that describes how the student learns, what level the student has achieved in a particular course, whether the student understands a particular concept or possesses a particular skill, the pace at which the student learns, or even the time of day the student is most likely to correctly answer a question.
  • Many of these student attributes change over time, but may still be considered persistent based on the frequency of change. For example, a student may initially be a visual learner, but may later learn more efficiently by reading text-based material. Other student attributes may be more transient in nature. For example, the PDNA may include data that identifies the student's current location, what client computing device they are using (e.g. iPhone, laptop, or netbook), what operating system they are using, whether or not their web browser supports the Flash plug-in, or whether the student sets his status as “tired.”
  • It may be the case that a particular student performs differently depending on environmental factors, while another student may be capable of learning regardless of the environment. For example, one student may be able to study on a commuter train while another may not. One student may be capable of learning via an audio program while another requires text information or video. In addition, certain tasks may be reserved for particular times or places in order to calibrate the system. For example, the system may be configured to only offer assessments when a student's transient PDNA data shows that the time in the student's current time zone is between certain daylight hours.
  • Metrics may be assigned to particular attributes in each student's PDNA. For example, metrics may describe expected or historical success with different learning styles. These metrics may help learning experience engine 110 determine whether the student is successful when participating in collaborative learning exercises, or whether the student would benefit more from self-study. A student may have a metric of “7” for the attribute “visual learner” and a metric of “2” for the “audio learner” attribute. Higher scores are not necessarily the only factor used in determining the learning strategy for the student, however. For example, the learning track that the student is on may actually require that the student develop listening skills. Therefore, the learning experience engine 110 may recommend a learning experience for the student that will bolster his ability to learn via audio. In other words, the system will teach the student the underlying skills required to allow the student to become a successful audio learner.
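  • The following minimal sketch, with hypothetical attribute names and values, illustrates per-attribute metrics in a student's PDNA and a selection that can override the highest-scoring modality when the learning track requires a weaker modality to be developed.

      # Minimal sketch (hypothetical attribute names): per-attribute metrics in a
      # student's PDNA, and a selection that can override the highest score when
      # the learning track requires a weaker modality to be developed.

      pdna_metrics = {"visual learner": 7, "audio learner": 2}

      def choose_modality(metrics, required_by_track=None):
          if required_by_track:
              return required_by_track          # e.g. build listening skills
          return max(metrics, key=metrics.get)  # otherwise use the strongest

      print(choose_modality(pdna_metrics))                    # visual learner
      print(choose_modality(pdna_metrics, "audio learner"))   # audio learner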
  • PDNA for a particular student may be analyzed and compared to PDNA of other students to make learning experience predictions. For example, data analysis engine 150 may analyze the PDNA information for all students in the system, over time, to predict various things, including: what method of learning is best for each student, which track will yield the highest chance of success for a given student in a particular program, which programs the student would be successful in, which courses the student may be expected to struggle with, and even which career would best suit the student.
  • Snapshots and Learning Intervals
  • As students use learning management platform 100 to manage their learning needs, natural checkpoints emerge from the data. The frequency of these checkpoints can be adjusted by system administrators, but is generally related to a level of achievement reached by the student for whom the checkpoint was generated. For example, a checkpoint may be generated for a student each time that student successfully demonstrates a skill associated with a learning object.
  • In an embodiment, snapshots of personal cognitive DNA information are taken at each checkpoint. A snapshot is a static record of PDNA as it existed at the time of the checkpoint. Each snapshot is stored in a temporal database or other data storage mechanism, such as knowledge base 160.
  • Since a new snapshot can be taken for each learning interval (the time between checkpoints), the PDNA information stored in the snapshot may be used by learning experience engine 110 to formulate recommendations for other students that may have personal cognitive DNA that is similar to the snapshot.
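  • A minimal sketch of snapshot capture is shown below; the structure and field names are hypothetical and merely illustrate storing a timestamped, static copy of the PDNA at each checkpoint while the live PDNA continues to change.

      # Minimal sketch (hypothetical structure): taking a timestamped snapshot of
      # a student's PDNA each time a checkpoint is reached.

      import copy, datetime

      snapshots = []

      def take_snapshot(student_id, pdna, checkpoint):
          snapshots.append({
              "student_id": student_id,
              "checkpoint": checkpoint,            # e.g. a completed learning object
              "taken_at": datetime.datetime.utcnow().isoformat(),
              "pdna": copy.deepcopy(pdna),         # static record, immune to later changes
          })

      pdna = {"abstract learning ability": 80, "last object completed": "LO-208"}
      take_snapshot("student-42", pdna, checkpoint="LO-208")
      pdna["last object completed"] = "LO-212"     # live PDNA moves on; snapshot does not
      print(snapshots[0]["pdna"]["last object completed"])  # LO-208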
  • DNA Fingerprints
  • An analysis of hundreds, thousands, or even larger numbers of PDNA data sets yields statistically valid cognitive DNA “fingerprints,” in an embodiment. A DNA fingerprint is based on aggregate PDNA data, which may include PDNA snapshots. Generally, a DNA fingerprint is made by selecting a set of PDNA data having one or more PDNA attributes in common and generating a single profile that is representative of the entire set.
  • For example, a DNA fingerprint may be generated for students that have recently completed a learning object that teaches the calculus skill of taking the derivative of a second degree polynomial. The recency of the completion of the learning object is determined based on the time that the data was stored, so snapshots that were taken at checkpoints occurring immediately after students completed the learning object will qualify for inclusion in the set of PDNA data considered for use in the creation of the DNA fingerprint. Each attribute in the PDNA data considered in the creation of the DNA fingerprint may be aggregated, averaged, or otherwise considered, resulting in a fingerprint of that attribute. For example, if the average value of the attribute “abstract learning ability” is “80” in the PDNA data in the set, then the DNA fingerprint may inherit this value for the same attribute. Any method of considering or combining PDNA data to generate DNA fingerprint data may be used. For example, the lowest value, the median value, or a sum of the values may be used as the fingerprint value for a particular attribute. Some attributes, especially those with very little correlation to the common PDNA attribute, may not be assigned a DNA fingerprint value, or may be assigned a NULL value, indicating that conclusions about that attribute are statistically invalid for that set of PDNA. Once each attribute has been considered for the set, then the resulting values for each attribute are stored in one or more records as a DNA fingerprint for that set of PDNA data.
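  • The following minimal sketch, using hypothetical attribute names, illustrates one way a DNA fingerprint might be built by averaging each attribute over a set of snapshots and leaving an attribute unset when too few snapshots report it.

      # Minimal sketch (hypothetical attribute names): building a DNA fingerprint
      # by averaging each attribute over a set of PDNA snapshots, and setting an
      # attribute to None when too few snapshots report it to be meaningful.

      def build_fingerprint(snapshots, min_samples=2):
          attributes = {a for snap in snapshots for a in snap}
          fingerprint = {}
          for attr in attributes:
              values = [snap[attr] for snap in snapshots if attr in snap]
              fingerprint[attr] = (sum(values) / len(values)
                                   if len(values) >= min_samples else None)
          return fingerprint

      snapshots = [
          {"abstract learning ability": 80, "reading speed": 210},
          {"abstract learning ability": 90, "reading speed": 190},
          {"abstract learning ability": 70, "night owl score": 3},
      ]
      print(build_fingerprint(snapshots)["abstract learning ability"])  # 80.0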
  • Using Snapshots and DNA Fingerprints to Make Learning Recommendations
  • A database of snapshots and DNA fingerprints may be used to shorten learning experience engine 110 recommendation decision making time by matching a student's PDNA with a pre-existing DNA fingerprint.
  • Because fingerprints are based on snapshots, some PDNA data used to generate the matching fingerprint may have already advanced far beyond the stage when the snapshot was taken. For example, a snapshot that was taken two years ago may be used in the creation of a DNA fingerprint. The attributes of the PDNA that change over time, such as those that are based on location or “last learning object completed,” may be reflected in the fingerprint, but not in the current PDNA of the student or students whose data was used to create the fingerprint. Because of this, a student whose current PDNA has a high correlation with a fingerprint that is based on an old PDNA snapshot of another student may be considered similar to an “old” version of that other student.
  • If fingerprint PDNA data indicates that students having a particular attribute progressed at a particular rate or excelled in a particular subject, it may be assumed that students with a matching PDNA will experience the same success. As time goes on, the students associated with the PDNA data upon which a fingerprint is based may become less and less alike, so a fingerprint built from the same set of students using snapshots associated with subsequent checkpoints would yield less valid prediction information because of the divergence of some attributes. However, recommendations that are based on a comparison between a student's current PDNA and fingerprints based on sets of PDNA data having the same values for attributes as the student's current PDNA will be highly accurate. The student's PDNA may be matched to a new fingerprint whenever necessary, desired, or requested.
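  • A minimal sketch of fingerprint matching follows; the distance measure and attribute names are hypothetical and merely illustrate selecting the stored fingerprint closest to a student's current PDNA over the attributes they share.

      # Minimal sketch (hypothetical attributes): matching a student's current
      # PDNA against stored fingerprints over shared attributes.

      def distance(pdna, fingerprint):
          shared = [a for a in pdna
                    if a in fingerprint and fingerprint[a] is not None]
          if not shared:
              return float("inf")
          return sum(abs(pdna[a] - fingerprint[a]) for a in shared) / len(shared)

      def closest_fingerprint(pdna, fingerprints):
          return min(fingerprints, key=lambda fp: distance(pdna, fp["attributes"]))

      fingerprints = [
          {"id": "fp-calculus", "attributes": {"abstract learning ability": 80.0}},
          {"id": "fp-writing",  "attributes": {"abstract learning ability": 55.0}},
      ]
      student_pdna = {"abstract learning ability": 78}
      print(closest_fingerprint(student_pdna, fingerprints)["id"])  # fp-calculus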
  • Learning recommendations generated by learning experience engine 110 are not limited to suggesting which course a student should take next. A student's PDNA may be used to make very specific recommendations based on a determination that the student is cognitively similar to one or more other students. The fingerprint method described above may be used, matching the student based on student grouping criteria that is stored in knowledge base 160. In addition, any other matching technique may be used.
  • An individualized learning experience for each student may be achieved by providing real-time recommendations based on predictive information associated with cognitively similar students. When a first student is no longer cognitively similar to a second student, the second student's profile is no longer considered when selecting which learning recommendations to give to the first student. Thus, while a short portion of the learning experience, such as learning a particular concept or skill required for a portion of a course, may be similar to the experience of others, students may complete an overall learning program of study by taking a completely unique path with respect to other students.
  • Temporal Fingerprint Paths
  • A temporal fingerprint path generally refers to a series of DNA fingerprints that are generated based on the same set of students. For example, consider a situation in which a PDNA snapshot for Lenny Learner that was taken one year ago was used in the fingerprint creation process of a particular fingerprint. All PDNA sets used in the particular fingerprint, including Lenny's, may then be analyzed for subsequent snapshot information to create a temporal fingerprint path.
  • In an embodiment, each fingerprint that is part of a temporal fingerprint path may be based on snapshots that are not associated with the same “time.” Instead, each fingerprint may be based on a series of snapshots for each user in the set of users used to generate the fingerprint without respect to the time the snapshots were taken. For example, Lenny Learner and Laura Learner may each be associated with PDNA information used in generating a temporal fingerprint path that includes five fingerprints. The snapshots associated with Lenny that are used to generate the five fingerprints may have been generated over a period of five years, starting seven years ago. The snapshots associated with Laura that are used to generate the five fingerprints may have been generated over a period of two years, starting three years ago. In this embodiment, the temporal fingerprint path may be designed to represent progression through a series of learning objects, with Lenny and Laura being chosen based on their involvement with the same learning objects.
  • In another embodiment, a temporal fingerprint path may be more strictly based on time. For example, the snapshots associated with Lenny that are used to generate the five fingerprints may have been generated over a period of exactly two years, starting seven years ago, and the snapshots associated with Laura may have been generated over a period of exactly two years, starting three years ago. Thus, the length of time from the beginning of the fingerprint path is given added importance.
  • However, a snapshot for a particular student that was taken earlier than the snapshots for that student used in generating a first fingerprint is unlikely to be used in the generation of a second fingerprint in a temporal fingerprint path. A temporal fingerprint path generally implies progression, so a subsequent snapshot for that user would likely be selected for use in the generation of subsequent fingerprints.
  • Temporal fingerprints are particularly useful in generating long-term learning strategies for students that are enrolled in a degree or certification program that has specific requirements. This is particularly true for students for whom very little PDNA information has been gathered. For example, a student may have no history with the system, but may be enrolled in a Computer Science degree program. A temporal fingerprint path may be generated based on the set of students that have completed the Computer Science degree program. Based on the temporal fingerprint path, learning experience engine 110 may provide an expected long-term learning strategy to the student. As more students progress through the Computer Science degree program, the set of students considered for generating a temporal fingerprint path for the program may change, altering the temporal fingerprint path. For example, the set of students considered for generating a temporal fingerprint path may be based only on students that have completed the Computer Science degree program within the last five years in an embodiment.
  • Other factors may also be considered when selecting the set of students that are used to generate a temporal fingerprint path. For example, degree programs change over time, and today's students may need to acquire different skills than yesterday's students to complete the Computer Science degree program. Thus, in another embodiment, only students in the same “version” of the Computer Science degree program are considered when creating a temporal fingerprint path for a particular student.
  • Content Feedback Interface
  • A content feedback interface may be provided with learning space platform 120. FIG. 4 illustrates a browser window 400 with a video content window 410 and a content feedback interface 420 in an embodiment. By selecting a number of “stars,” a student may indicate her overall satisfaction with the content displayed in video content window 410. The content feedback interface allows students to provide feedback about the content, and whether or not the content or the tool that enables the content was appropriate for the situation in which it was used.
  • In addition to the features described with respect to FIG. 4, a content feedback interface may allow users to rate content based on many factors, such as whether they found the content convenient, easy to understand, appropriate as a next step to the previous content used, or whether the student likes the content. Text entry fields and other interface elements may be used as appropriate for gathering additional feedback data from students.
  • All of the content feedback may be stored in learning content manager 130, and used to tag content. This allows users to force content adaptation and alter the learning path of other students by increasing or decreasing the likelihood that the content will be used in a particular situation or for a student with a particular type of PDNA.
  • For example, two different video presentations may be used to teach the same skill. Although the first video presentation may have a higher success rate than the second, the second video presentation is preferred by students. Taking student preferences into consideration, learning experience engine 110 may recommend the second video presentation when the disparity between the preferences associated with the two presentations reaches a certain threshold. That is to say, when the less effective second video presentation is much more strongly preferred by students than the first, the second video presentation becomes the default presentation recommended to students.
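  • The following minimal sketch, with hypothetical success and preference figures, illustrates a threshold rule under which the better-liked presentation becomes the default even though the other has a higher measured success rate.

      # Minimal sketch (hypothetical numbers): preferring the better-liked of two
      # content items only when the gap in student preference exceeds a threshold.

      def default_presentation(items, preference_gap_threshold=0.25):
          by_success = sorted(items, key=lambda i: i["success_rate"], reverse=True)
          best, other = by_success[0], by_success[1]
          if other["preference"] - best["preference"] > preference_gap_threshold:
              return other
          return best

      items = [
          {"id": "video-1", "success_rate": 0.82, "preference": 0.40},
          {"id": "video-2", "success_rate": 0.74, "preference": 0.78},
      ]
      print(default_presentation(items)["id"])  # video-2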
  • Assessments
  • In one embodiment, assessments, such as assessments 340 are included in each learning object. A student uses assessments to demonstrate the skill associated with the learning object. More than one assessment may be included in a learning object. Learners may be required to successfully complete all, or a subset, of the assessments in order to receive an advancement recommendation from learning experience engine 110. The number and type of assessments required may depend on historical data describing past experiences with the learning object or other related learning objects. For example, if a student is known to have difficulty with learning objects that include mathematical skills, a student may be required to successfully complete relatively more assessments for a particular learning object associated with a math skill.
  • Remediation
  • Remediation data, such as remediation data 330, may also be included in a learning object. In an embodiment, remediation data includes detailed information describing which learning objects are preferred remediators (objects that assist in skill building) for the current learning object, or even which learning objects the current learning object is a preferred remediator for. For example, a student may experience little or no success in completing the assessments associated with learning object 212. Learning object 212 may include remediation information listing learning objects 205 and 209 as good remediators. Remediation metrics may be used to help learning experience engine 110 determine which remediator to recommend. If learning object 205 has a higher remediation metric than learning object 209, then learning object 205 may be selected, and the student will be provided content and assessments associated with learning object 205. When the student is ready, he will advance to learning object 208, and then make another attempt at learning object 212.
  • Remediation information and remediation metrics may be different for different types of users, or users having particular types of PDNA. For example, learning object 205 may have a higher remediation metric than learning object 209 for group A, but learning object 205 may have a lower remediation metric than learning object 209 for group B. In addition, remediation information is not limited to learning object relationships. Remedial relationships may be formed at the assessment level, even if assessments are not within the same learning object or present skills hierarchy.
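  • A minimal sketch of group-specific remediation metrics follows; the metric values and group labels are hypothetical and merely illustrate selecting a different preferred remediator for different groups of users.

      # Minimal sketch (hypothetical values): remediation metrics that differ by
      # student group, used to pick the preferred remediator for a learning object.

      REMEDIATION_METRICS = {
          # (struggling object, candidate remediator) -> metric per group
          ("LO-212", "LO-205"): {"group A": 0.9, "group B": 0.4},
          ("LO-212", "LO-209"): {"group A": 0.6, "group B": 0.8},
      }

      def pick_remediator(struggling_object, group):
          candidates = {rem: metrics[group]
                        for (obj, rem), metrics in REMEDIATION_METRICS.items()
                        if obj == struggling_object}
          return max(candidates, key=candidates.get)

      print(pick_remediator("LO-212", "group A"))  # LO-205
      print(pick_remediator("LO-212", "group B"))  # LO-209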
  • Skills Hierarchy Data
  • Skills hierarchy data such as skills hierarchy data 360 describes a learning object's skills hierarchy associations. Each learning object may be included in multiple skills hierarchies. For example, learning object 207 is included in skills hierarchy 200A and skills hierarchy 200B. Each of skills hierarchy 200A and skills hierarchy 200B represents a skill set. For example, skills hierarchy 200A and skills hierarchy 200B may each represent a course, such as English Composition or Linear Algebra. Overlap in skills hierarchy data illustrates the multidimensional nature of the larger skills hierarchy used in learning management platform 100.
  • Bloom Level Data
  • Bloom level data such as bloom level data 370 may be included in a learning object. Bloom level data identifies Bloom taxonomy information for one or more courses in which the learning object is required. The six Bloom levels describe whether a user has knowledge of and can remember a concept, understands and can describe and explain a concept, can apply the concept, can analyze based on or according to the concept, can evaluate based on or according to the concept, or can create based on or according to the concept. A particular learning object may be associated with one Bloom level for a particular skills hierarchy, but may be associated with a different Bloom level for another skills hierarchy. For example, learning object 211 may be at the “evaluate” Bloom level for skills hierarchy 200A, but may be at the “apply” Bloom level for skills hierarchy 200B.
  • Learning Object Metadata
  • Metadata such as metadata 350 may include data about the learning object. For example, version information, change tracking information, or other information about the learning object may be stored in metadata 350. Metadata 350 may also include data that should be communicated to learning space platform 120 when learning space platform 120 instantiates the learning object and associated content and tools. For example, a description or mini-syllabus describing the learning object and associated skill may be sent by learning experience engine 110 when the learning object is selected by learning space platform 120. Additional metadata may be stored in learning object 300. Notes, future implementation specifics, and developmental skills hierarchy identifiers are all examples of data that may be stored in metadata 350. The amount of metadata that may be stored about an object is bounded only by system constraints or administrator discretion. Thus, metadata 350 may be considered a catch-all for other data that is to be associated with a learning object. In addition, learning objects are extendable, and may have other data categories associated with them that are not described herein.
  • Object-Specific Personalized Data
  • Detailed personalized information may be stored on a per-person, per-object basis. For example, a particular student may have reached a certain level of understanding with respect to a particular skill that is associated with a particular object. The information stored may be mapped to Bloom taxonomy levels, and may include additional student metadata that describes the particular experience that the student has had with the object. For example, metadata may describe how fast the student learned the skill, whether the student enjoyed the content used to learn the skill, and trouble-areas for the student. This information may be gathered through frequent assessments, learning activities, learning games, homework assignments, and participation in group activities, in addition to other information-generating events associated with the student's interaction with learning management system 100.
  • Traversing the Skills Hierarchy
  • The phrase “traversing the skills hierarchy” refers to a student's progression through the learning objects in the skills hierarchy. A skills hierarchy for a particular course syllabus, such as skills hierarchy 200B, may include multiple learning objects, such as learning objects 207-213 and 215-217. The skills hierarchy may be for a statistics course, and require all of the skills included in the associated learning objects to be acquired by the student in order to mark the course as “completed” for the student. Skills hierarchy 200B is not necessarily representative of a desired or anticipated size of a skills hierarchy for a particular course. For example, a skills hierarchy for a statistics course may include hundreds of learning objects, each directed to a granular skill or concept.
  • A student begins traversing the skills hierarchy by receiving a learning recommendation from learning experience engine 110. If the student has not interacted with learning management platform 100 before, then the student will need to provide information to help build a PDNA. For example, the student may need to take one or more pre-assessments to determine the skill level that the student has with respect to his program. Generally, the more pre-assessments the student takes, the more accurate the initial recommendation will be. Other useful information such as transcript information from higher education institutions may also help to build a PDNA for the user.
  • Learning experience engine 110 takes into account information stored in the student's PDNA, such as pre-assessment information, information stored in knowledge base 160, and information learned from data analysis engine 150 to determine which skills hierarchy the student should traverse, and which learning object on that skills hierarchy the student should begin with. For example, knowledge base 160 may include information about the student that indicates the student has completed the course associated with skills hierarchy 200A, and has therefore completed learning objects 207, 208, and 211. The learning recommendation may therefore not consider these learning objects as required because they have been completed in a previous course. Learning experience engine 110 may therefore recommend that the student begin at learning object 209 on skills hierarchy 200B.
  • Learning object 212 may represent the ability to calculate a confidence interval, which is used to indicate the reliability of statistical estimates. Because confidence intervals are typically expressed as a percentage, the ability to calculate percentages may be considered a prerequisite skill, identified by learning object 208. Skills hierarchy 200B indicates that learning object 209 also represents a prerequisite skill for learning object 212. Given this information, and the knowledge that the student has completed the course represented by skills hierarchy 200A, it may be determined that learning object 209 is the only prerequisite required in order for the student to attempt learning object 212, even though learning object 208 was completed as part of a separate course.
  • In an embodiment, when a student attempts to complete a learning object, but experiences little or no success completing assessments associated with that learning object, the student will traverse the skills hierarchy toward a remedial learning object. For example, if a student completes the assessments at learning object 212 with a small metric of success, such as answering only 20% of questions correctly or completing tasks incorrectly, learning experience engine 110 may recommend that the student move to learning object 203. Alternatively, if a student completes the most difficult or complicated assessments associated with a learning object with complete accuracy, it is possible that the system is unaware of skills possessed by the student. Learning experience engine 110 may therefore recommend that the student begin at a much higher level, even skipping levels in the skills hierarchy. For example, a student that successfully completes the most difficult assessments associated with learning object 211 with perfect accuracy may receive a recommendation to attempt assessments associated with learning object 217 in order to complete the course. Based on the assessment information, learning experience engine 110 may indicate that the student has finished the course, or may recommend additional content that is appropriate for the student.
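  • The following minimal sketch, with hypothetical score thresholds, illustrates the three outcomes described above: remediating on a very low score, skipping ahead on a perfect score at the highest difficulty, and otherwise advancing normally.

      # Minimal sketch (hypothetical thresholds): deciding the next step after an
      # assessment based on the score and the difficulty attempted.

      def next_step(score, difficulty, remediator, next_object, skip_target):
          if score < 0.3:
              return ("remediate", remediator)
          if score == 1.0 and difficulty == "hardest":
              return ("skip_ahead", skip_target)
          return ("advance", next_object)

      print(next_step(0.2, "normal",  "LO-203", "LO-213", "LO-217"))  # ('remediate', 'LO-203')
      print(next_step(1.0, "hardest", "LO-203", "LO-213", "LO-217"))  # ('skip_ahead', 'LO-217')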
  • It is not necessary that the types of tools or content used for remediation be similar to the tools and content used for the skill that required remedial training. Furthermore, it is not necessary that the course that includes the skill requiring remedial training be in a similar category to the learning object required for remediation. For example, a group of business students in an MBA business development course may be working on a learning object using a collaborative tool that includes a shared whiteboard, a chat session, and a voice over Internet protocol connection. During the collaborative session, however, the students may be required to calculate the net present value (NPV) of a company, which is the total present value of a time series of cash flows. Cash flows must be discounted to a present value, and then summed together. If a particular student has difficulty making the calculation, learning space platform 120 may interrupt the present learning object for that student and retrieve a recommendation from learning experience engine 110. Learning experience engine 110 may recommend that the student use a particular behaviorism-based tool associated with another learning object in order to acquire the necessary skill to calculate NPV. Once the student successfully completes the mathematics assessment, he will then be returned to the collaborative session with the other MBA students. This example illustrates the platform's unifying data concepts and data-driven behavior: the platform switches between appropriate learning models (e.g., a student may have been in a social constructivist learning space working on a case study on finance when the need for remediation of a concept surfaced, and the next experience chosen by the platform might be an implementation of a cognitive tutor designed for math instruction).
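  • A minimal sketch of the NPV calculation mentioned above is shown below; the discount rate and cash flow series are hypothetical and serve only to illustrate discounting each cash flow to its present value and summing the results.

      # Minimal sketch: net present value, with a hypothetical discount rate and
      # cash flow series (cash_flows[0] occurs now, cash_flows[1] in one period).

      def npv(rate, cash_flows):
          return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

      print(round(npv(0.10, [-1000, 500, 500, 500]), 2))  # 243.43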
  • In the previous example, learning experience engine 110 effectively created a mini-course for the student to ensure he has the skills required to be successful on his current track. This is possible partly because the learning management platform 100 keeps track of data about the students across different courses, and uses this information to help the student traverse the tree. For example, the MBA student may have already taken algebra, so a short remedial path to remind the student may have been the only information that the student needed in order to move on. However, if the student's algebra course was over two years ago, and he has taken no math since then, learning experience engine 110 may recommend a more detailed review of algebraic concepts, even taking into consideration future courses that the student is expected to take and the math skills required for those expected courses.
  • While the student is traversing the tree, content, tools, and learning object metadata are changed to reflect the success of the path taken by the student. For example, a student may perform poorly on one or more assessments associated with learning object 216. Remediation data associated with learning object 216 indicates that learning object 209, or a particular assessment or content item associated with learning object 209, is a good remediator for learning object 216 or for a particular assessment associated with learning object 216. If the student is directed to learning object 209 for remediation, and then attempts the assessment at learning object 216 again with success, then a remediation metric describing the association between learning object 216 and learning object 209 is changed to reflect the success. In this way, the skills hierarchy itself is dynamic, continually changing based on input from the students.
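  • The following sketch shows one way such a remediation metric might be tracked and updated; the data layout and the success-rate measure are assumptions made for illustration.

```python
# Hypothetical remediation-metric store: for each (target, remediator) pair it
# tracks how often remediation was followed by a successful re-attempt.
remediation_metrics = {
    ("LO-216", "LO-209"): {"attempts": 0, "successes": 0},
}

def record_remediation_outcome(target, remediator, passed_retry):
    """Update the remediation metric after a student retries the target assessment."""
    metric = remediation_metrics.setdefault(
        (target, remediator), {"attempts": 0, "successes": 0})
    metric["attempts"] += 1
    if passed_retry:
        metric["successes"] += 1

def remediation_strength(target, remediator):
    """Fraction of remediated students who later passed the target assessment."""
    metric = remediation_metrics.get((target, remediator), {"attempts": 0, "successes": 0})
    return metric["successes"] / metric["attempts"] if metric["attempts"] else 0.0

record_remediation_outcome("LO-216", "LO-209", passed_retry=True)
print(remediation_strength("LO-216", "LO-209"))  # -> 1.0 after one successful retry
```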
  • The distinct paths that a student may take reside in a multidimensional learning space associated with a skills hierarchy. The path that a student takes through a multidimensional learning space should provide the most efficient route to the best outcome for that student. A skills hierarchy includes many learning objects, and learning objects include or are associated with a skill and a variety of content and assessments. Although two students completing the same course may be said to have acquired the same skills, each student may have taken a different learning path.
  • For example, student A may be a visual learner, and be cognitively similar to a set of students. These students are placed into a particular group, or “tribe,” based on their cognitive similarities. In this example the tribe is referred to as tribe A. Student B, who is part of a different tribe, may learn well by reading text, and gain very little benefit from visual learning techniques. Based on this information, learning experience engine 110 will provide different recommendations to each student for some learning objects. For example, student A may acquire the skill associated with learning object 206 by watching a video and completing three assessments that are based on an interactive game tool. Student B, on the other hand, may acquire the same skill by reading a chapter in an e-book, and completing a single assessment requiring a writing assignment.
  • While traversing the skills hierarchy, students may provide feedback by “tagging” content that they are interacting with. For example, a student may particularly enjoy an interactive learning game. The student may select a user interface element such as a button labeled with the word “fun” to indicate the preference. This information may be used in later learning recommendations for the student and other students that are cognitively similar to the student.
  • The skills hierarchy may also evolve over time. By analyzing data associated with a particular learning object, such as data that shows how well students in different categories perform with respect to the learning object, learning experience engine 110 may determine that the object itself is defective or produces a sub-optimal result. For example, if a group of students that tend to perform well do not perform well on assessments associated with a particular object, then it is possible that the content used to teach the skill is mismatched with the assessments associated with the particular learning object. Learning experience engine 110 may generate reports that highlight these deficiencies so that course developers and content developers may change the learning object, or introduce an alternate, competitive learning object. Competitive learning objects will be described in greater detail hereinafter.
  • Targeted Interaction
  • Students, faculty, and other users of the system may interact with one another using collaborative tools and social networking features associated with the learning management platform 100. In an embodiment, any student working on any assignment from any location at any time will be able to click a single button to inquire who else among the faculty or students is working at the same time, in the same or similar content area, and then begin to engage in questions or discussion. For example, a student that is working on a particular learning object may wish to collaborate with other students, faculty, tutors, or other users of the system working on the same learning object.
  • In an embodiment, the student may press a button or otherwise interact with a user interface element associated with collaboration logic built into learning experience engine 110, thereby generating a collaboration request to the learning management system. In response to receiving the request, the learning management system determines which other users of the system are available for collaboration with respect to the particular learning object. The learning management system then returns a list of available collaborators to the student. Although this embodiment is directed to collaboration based on learning objects, this concept may be used with any other commonality between collaborating users. For example, users may collaborate with other users of the system that have some association with a particular skill hierarchy.
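  • A minimal sketch of the collaborator lookup described above is shown below, assuming an in-memory view of active sessions; in the described system this state would come from the learning management system rather than a local dictionary.

```python
# Hypothetical in-memory view of which users are currently active on which
# learning object; the identifiers and roles are illustrative.
active_sessions = {
    "alice": {"learning_object": "LO-212", "role": "student"},
    "bob":   {"learning_object": "LO-212", "role": "tutor"},
    "carol": {"learning_object": "LO-305", "role": "student"},
}

def available_collaborators(requesting_user):
    """Return users currently working on the same learning object as the requester."""
    target = active_sessions[requesting_user]["learning_object"]
    return [user for user, session in active_sessions.items()
            if user != requesting_user and session["learning_object"] == target]

print(available_collaborators("alice"))  # -> ['bob']
```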
  • In an embodiment, users of the system may define preferences associated with collaborative learning. For example, students may wish to work collaboratively with other students, or may wish to only work with faculty or tutors. Additionally, students may wish to collaborate only with other students that have similar cognitive DNA. These preferences and other preferences based on any other similarities or differences between users may be stored as part of a student's cognitive DNA, and may be used to determine the makeup of a collaborative group.
  • Collaborative groups may also be limited to optimal group sizes. In an embodiment, the group sizes of the groups to which users are assigned may be based on cognitive DNA similarities between the users. For example, a particular type of student may perform better in a smaller group, while another type may perform well with larger groups. Group size may also be based on the type of collaborative tool being used. For example, a shared whiteboard system may become less effective as more people attempt to draw on it. However, a chat room associated with a video tutorial may allow for a larger number of users, which may be further based on the activity level of the chat room.
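  • One possible way to combine tool-based capacity limits with a student's group-size preference is sketched below; the capacity numbers and the preference flag are illustrative assumptions.

```python
# Illustrative defaults for tool-based group-size limits; the specific numbers
# are assumptions, not values taken from the disclosure.
TOOL_CAPACITY = {"whiteboard": 4, "chat": 25, "voip": 8}

def max_group_size(tool, prefers_small_groups):
    """Cap the collaborative group size by tool type and by student preference."""
    cap = TOOL_CAPACITY.get(tool, 10)
    return min(cap, 3) if prefers_small_groups else cap

print(max_group_size("whiteboard", prefers_small_groups=False))  # -> 4
print(max_group_size("chat", prefers_small_groups=True))         # -> 3
```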
  • Students may collaborate with other users of the system that are using the same learning tools as themselves, or may collaborate with students that are using a different learning tool than they are. For example, a collaboration may involve one student writing on a white board, and another student typing in a chat room. The input to the white board may be dynamically translated to text that appears in the chat room, while the text in the chat room appears on the white board.
  • In an embodiment, a user of the system, such as a faculty member or first student may provide a learning tool recommendation to a second student based on the information shared in the collaboration session.
  • Competitive Learning Objects
  • Each node in a skills hierarchy has been described herein as having a single learning object. However, nodes in a skills hierarchy need not be limited to having only one associated learning object. Competitive learning objects that are directed to a similar or identical skill may reside at the same node in a skills hierarchy. Each learning object may include different content and assessments than the other learning objects that occupy the node.
  • Metadata associated with each learning object of a node may indicate the success or failure of that learning object across categories. For example, one learning object may be more successful for nursing students while another learning object may be more successful for engineering students. Although different categories of students are discussed, other factors may be used to measure the success of a learning object. For example, a first learning object may be a better remediator than a second learning object for a particular node in a skills hierarchy that is associated with a different course.
  • Competing learning objects are also useful when introducing new content and skills into the learning management platform 100. Adding a learning object as a competitive learning object rather than replacing the existing learning object allows for a trial period to determine whether the new learning object provides an improvement over the old. Integration of new objects and content may therefore be seamless. It may also turn out that the best action is to leave all of the competing learning objects in place because each provides a different benefit that the other learning objects sharing the node cannot provide.
  • Learning System Modules
  • In the illustrated embodiment, the learning management platform 100 includes an Application Programming Interface (API) 115 that is configured to interact with learning system modules. API 115 may be implemented over a network connection or any other communication method. Learning system modules may include third-party artificial intelligence systems or other decision-making, recommendation, and learning systems. These learning system modules may have access to other elements of the learning management platform 100, such as personal cognitive DNA manager 170, or may perform independently of these other elements. In an embodiment, skills hierarchy manager 180 operates as a learning system module, and interfaces with API 115. Learning modules are not limited by learning management platform 100, and may include additional supporting systems, hardware, networking equipment, cloud-computing systems, and external data sources. Learning modules may include any software, hardware, or network resources necessary to perform optimally.
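  • As a rough illustration of the kind of contract a learning system module might expose through API 115, consider the sketch below; the class and method names are assumptions and are not part of the disclosed interface.

```python
from abc import ABC, abstractmethod

class LearningSystemModule(ABC):
    """Hypothetical plug-in contract a module might expose to API 115."""

    @abstractmethod
    def recommend(self, student_profile: dict, context: dict) -> dict:
        """Return a recommendation payload for the learning experience engine."""

    @abstractmethod
    def observe(self, event: dict) -> None:
        """Receive assessment results or other events for internal bookkeeping."""
```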
  • Although a skills hierarchy-based system has been presented, such a system, which may include skills hierarchy manager 180, is not required for learning management platform 100 to function. Learning system modules such as learning system modules 190A, 190B, and 190C may be configured to operate based on any model or criteria. Rule-based models, for example, may include a decision-making structure that is much different from the directed graph approach, taxonomy, or skills hierarchy described herein. For example, a learning system module configured to operate using a rule-based model may receive, as input from learning experience engine 110, a text-based answer from a student. Based on rules within the model used by the learning system module, the module may reject the answer and provide associated information to learning experience engine 110. The rules may be based on linguistics or any other criteria. For example, the learning system module may detect that the text-based answer was misspelled, that a word did not have the required number of syllables, or that the sentence or paragraph structure was incorrect. A learning system module may even include rules to determine that an essay is missing a thesis sentence.
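  • A minimal sketch of such a rule-based check follows; the particular rules and the word list are simplified stand-ins for the linguistic rules described above.

```python
# Hypothetical rule-based check of a text answer; the individual rules
# (spelling, length, punctuation) are simplified illustrative stand-ins.
KNOWN_WORDS = {"the", "cash", "flows", "are", "discounted", "and", "summed"}

def evaluate_answer(answer: str) -> list:
    """Return a list of rule violations; an empty list means the answer is accepted."""
    violations = []
    words = answer.lower().replace(".", "").split()
    if any(w not in KNOWN_WORDS for w in words):
        violations.append("possible misspelling or unknown word")
    if len(words) < 5:
        violations.append("answer too short to contain a complete sentence")
    if not answer.strip().endswith("."):
        violations.append("missing sentence-ending punctuation")
    return violations

print(evaluate_answer("The cash flows are discounted and summed."))  # -> []
print(evaluate_answer("cash summed"))                                # -> two violations
```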
  • Learning system modules need not be tied to one particular model. For example, a rule-based model may be combined with another model, such as a directed graph-based model in order to achieve the advantages of both models. In addition, no formal model is required in order to interface with learning experience engine 110. For example, a subject-specific cognitive tutor may be developed with no regard to learning theory whatsoever, and use a completely new structure and means for decision making, and that cognitive tutor may function as a learning system module that may be “plugged-in” to learning experience engine 110 by using an interface compatible with API 115.
  • Competing learning system modules may be used, allowing learning experience engine 110 to receive input from multiple modules and then provide the best overall learning experience for the student. For example, an artificial intelligence-based module, a taxonomy module, and a rule-based module that are all designed to teach a student to play the game of chess may be concurrently communicatively connected to API 115. Each module may store metadata associated with learning content stored in learning content manager 130, learning tools stored or indexed in learning tools manager 140, or any other data required to provide a response to learning experience engine 110. Learning experience engine 110 may request information, data, or recommendations from each module, which it then uses to provide a learning recommendation to the student. Learning experience engine 110 may determine that the experience provided to the student is superior when a particular module is used during a particular portion of the learning experience. This decision may be based on any criteria. For example, the experience may be deemed superior based on the speed at which the student progresses, or based on more superficial information, such as the delay incurred by using a less efficient module.
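  • One way the engine might weigh competing module outputs is sketched below, assuming each module reports a recommendation with a confidence value and the engine maintains a merit score per module; all names and numbers are illustrative.

```python
# Hypothetical merit-weighted selection among competing modules. Each module
# returns a recommendation plus a self-reported confidence; the engine weights
# that confidence by a merit score accumulated for the module over time.
def select_recommendation(module_outputs, merit_scores):
    """module_outputs: {module_name: (recommendation, confidence in [0, 1])}."""
    def weighted(item):
        name, (_, confidence) = item
        return merit_scores.get(name, 0.5) * confidence
    best_module, (best_rec, _) = max(module_outputs.items(), key=weighted)
    return best_module, best_rec

outputs = {
    "ai_module":       ("practice endgame puzzles", 0.7),
    "taxonomy_module": ("review opening principles", 0.9),
    "rule_module":     ("replay annotated games", 0.6),
}
merit = {"ai_module": 0.8, "taxonomy_module": 0.5, "rule_module": 0.9}
print(select_recommendation(outputs, merit))  # -> ('ai_module', 'practice endgame puzzles')
```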
  • Since learning experience engine 110 is capable of concurrently receiving input from more than one module, new modules may be added to the learning management platform 100 and gradually integrated into the system. Each module may increase in importance and influence based on its merit as determined over time by learning experience engine 110. Alternatively, a new module may be configured by an administrator of the learning management platform 100 to be the primary module used for a particular subject, skill, or task. For example, an independently tested and proven module may be integrated into learning management platform 100 and immediately promoted as the module with the most merit, overriding any determination made by learning experience engine 110. Learning experience engine 110 may then be configured to perform additional merit determinations for the new module, as well as existing modules.
  • Alternative Embodiments
  • In an embodiment, learning experience engine 110 may provide more than one recommendation for the student, along with data about each recommendation such as data associated with content or learning objects. The student then may be able to choose a learning experience based on preference. Learning experience engine 110 may gather preference information for further processing by data analysis engine 150, and use the analysis to assist learning experience engine 110 in determining future recommendations for that user or cognitively similar users.
  • In another embodiment, multiple recommendations are provided, but learning space platform 120 may make the final decision regarding which content, assessment, or learning objects are presented to the user. This method allows processing of the final learning recommendation to be offloaded to local logic, which can account for overriding transitory environmental factors such as location, mobile network signal strength, or lighting detected by the device.
  • In another embodiment, learning experience engine 110 may provide a group of learning objects and associated content, assessments, and tools, along with logic to allow learning space platform 120 to determine, based on assessment performance, which of the group of learning objects should be presented next. This method allows for extended offline learning. In this embodiment, learning space platform 120 may still detect connectivity and request an additional group of learning objects based on assessment performance, advancing the state of the locally stored data in order to keep a cache of offline learning information, content, and tools available at all times.
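  • A minimal sketch of the offline cache behavior described above follows; the branching structure and object identifiers are hypothetical.

```python
# Hypothetical offline cache: the engine ships a group of learning objects plus
# simple branching logic, and the local platform picks the next object from the
# cache based on assessment performance, refreshing the cache when online.
class OfflineLearningCache:
    def __init__(self, prefetched):
        # prefetched: {object_id: {"pass_next": id_or_None, "fail_next": id_or_None}}
        self.prefetched = prefetched

    def next_object(self, current_id, passed):
        branch = "pass_next" if passed else "fail_next"
        return self.prefetched[current_id][branch]

    def needs_refresh(self, current_id, passed):
        return self.next_object(current_id, passed) is None

cache = OfflineLearningCache({
    "LO-1": {"pass_next": "LO-2", "fail_next": "LO-0"},
    "LO-2": {"pass_next": None, "fail_next": "LO-1"},
    "LO-0": {"pass_next": "LO-1", "fail_next": None},
})
print(cache.next_object("LO-1", passed=True))    # -> LO-2
print(cache.needs_refresh("LO-2", passed=True))  # -> True: request more objects
```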
  • In another embodiment, a control system is included in learning management platform 100. The control system may be configured to select the next steps of the learning space navigation automatically or enable the navigation to include explicit steps set up by a faculty member, or a hybrid operation where a faculty member may decide to selectively override some of the steps of the skills hierarchy traversal.
  • In an embodiment, learning management platform 100 includes a personalized assessment system that is capable of taking different goal sets from students, faculty, institutions, and employers, and selecting the right set of assessments to ensure that the student has mastered the right skills in-line with the goals.
  • Example Method for Providing Learning Recommendations Based on Profile Similarity
  • In an embodiment, profiles are maintained for students by personal cognitive DNA manager 170 and knowledge base 160. These profiles may include PDNA, and describe one or more education-related attributes associated with students. A profile snapshot (PS1T1) that represents the state of the profile of a first user at a particular point in time (T1) is also maintained. A current profile (PS1T2) that represents a state of the profile of the first user at a second point in time (T2) is also maintained for the first user. Another current profile (PS2T2) that represents a state of the profile of a second user at a second point in time (T2) is also maintained.
  • In response to a request for content from the second user, learning management platform 100 determines, based at least in part on an attribute of the profile snapshot PS1T1 of the first user that is not an attribute of the current profile PS1T2 of the first user, that a similarity exists between the current profile of the second user and the profile snapshot of the first user. Based on the similarity between the out-of-date profile snapshot (PS1T1) of the first user and the current profile PS2T2 of the second user, a content item is selected for delivery to the second user.
  • For example, a first user may currently learn most effectively by using video-related learning tools, but at time T1 the first user may have learned most effectively by reading text. If the second user, who currently learns most effectively by reading text, needs a learning tool, the current profile PS2T2 of the second user can be matched to the snapshot PS1T1 of the first user's profile (taken at time T1) to determine a learning recommendation for the second user. For example, if at time T1 the first user learned a concept well by reading text X, then the learning recommendation may be for the second user to read text X (even though that would not be the learning recommendation that would now be given to the first user).
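  • The sketch below illustrates matching a current profile against an older snapshot, assuming profiles are simple attribute maps keyed by user and time; the attributes and the similarity measure are illustrative assumptions.

```python
# Hypothetical profile snapshots keyed by (user, time). Matching the second
# user's current profile against an older snapshot of the first user lets the
# system reuse what worked for the first user at that earlier stage.
snapshots = {
    ("user1", "T1"): {"preferred_medium": "text",  "pace": "slow"},
    ("user1", "T2"): {"preferred_medium": "video", "pace": "fast"},
    ("user2", "T2"): {"preferred_medium": "text",  "pace": "slow"},
}

def similarity(profile_a, profile_b):
    """Fraction of shared attribute keys with equal values."""
    keys = set(profile_a) & set(profile_b)
    if not keys:
        return 0.0
    return sum(profile_a[k] == profile_b[k] for k in keys) / len(keys)

current = snapshots[("user2", "T2")]
print(similarity(current, snapshots[("user1", "T1")]))  # -> 1.0 (matches old snapshot)
print(similarity(current, snapshots[("user1", "T2")]))  # -> 0.0 (current profiles differ)
```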
  • Example Method for Recommending Content Based on Student Context
  • In an embodiment, learning objects are maintained by skills hierarchy manager 180. Profiles are maintained for students by personal cognitive DNA manager 170 and knowledge base 160. Each profile describes one or more transient attributes that change with the environmental or emotional circumstances associated with the corresponding user. For example, a student may feel tired or sad, and indicate this through a user interface provided by learning space platform 120. This information may be reported to learning experience engine 110, which may store the information in knowledge base 160. As another example, learning space platform 120 may be executing on a mobile device with GPS (Global Positioning System) capabilities, and may report location information to learning experience engine 110, which may store the information in knowledge base 160.
  • A profile may have one or more persistent attributes that describe characteristics of the student that are pertinent to educational activities, such as learning style. For example, a particular student may not learn effectively when exclusively using audio content. Although two students may have the same value for a particular transient attribute, they may have different values for a particular persistent attribute. For example, both students may be on a train (a transient attribute), but one student is unable to learn effectively using the type of content available for use while in transit (a persistent attribute) while the other student is able to learn effectively using the type of content available for use while in transit. Under these circumstances, learning experience engine 110 may decide to recommend one learning object to the student who is better able to learn on the train. The other student may receive a recommendation to wait until off the train to continue learning activities.
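  • A minimal sketch of a recommendation that combines a shared transient attribute with differing persistent attributes is shown below; the attribute names are illustrative.

```python
# Hypothetical recommendation that combines a shared transient attribute
# (location: "train") with a per-student persistent attribute describing
# whether the student learns effectively with the content usable in transit.
def recommend_for_context(profile):
    if profile["location"] == "train" and not profile["learns_well_in_transit"]:
        return "wait until off the train to continue learning activities"
    return "recommend next learning object"

student_a = {"location": "train", "learns_well_in_transit": True}
student_b = {"location": "train", "learns_well_in_transit": False}
print(recommend_for_context(student_a))  # -> recommend next learning object
print(recommend_for_context(student_b))  # -> wait until off the train ...
```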
  • Example Method for Facilitating Targeted Interaction Between Students
  • In an embodiment, profiles are maintained for students by personal cognitive DNA manager 170 and knowledge base 160. A student sends a request for interaction with other users. For example, a particular student may want to discuss a particular educational concept with other students. Learning experience engine 110 detects a group of students that are interacting with the same learning object, or with learning objects similar to the learning object that the particular student is interacting with. Learning experience engine 110 then compares the profile of the particular student with the profiles of the students in that group, and determines which of those students are similar to the particular student. At least one other student is selected to interact with the particular student based on this comparison. For example, a second student may be invited to a virtual whiteboard session or live chat session with the particular student.
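  • The sketch below illustrates the selection flow described above, assuming a simple record of which users are active on which learning object and a basic attribute-match similarity; all names are hypothetical.

```python
# Hypothetical end-to-end selection of an interaction partner: filter to users
# on the same learning object, then rank candidates by profile similarity.
active = {
    "requester": {"object": "LO-212", "profile": {"style": "visual", "pace": "fast"}},
    "s2":        {"object": "LO-212", "profile": {"style": "visual", "pace": "fast"}},
    "s3":        {"object": "LO-212", "profile": {"style": "text",   "pace": "slow"}},
    "s4":        {"object": "LO-305", "profile": {"style": "visual", "pace": "fast"}},
}

def profile_similarity(a, b):
    keys = set(a) & set(b)
    return sum(a[k] == b[k] for k in keys) / len(keys) if keys else 0.0

def select_partner(requester):
    target = active[requester]["object"]
    candidates = [u for u, s in active.items()
                  if u != requester and s["object"] == target]
    return max(candidates,
               key=lambda u: profile_similarity(active[requester]["profile"],
                                                active[u]["profile"]),
               default=None)

print(select_partner("requester"))  # -> 's2'
```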
  • Example Method for Competitive Learning Objects in a Skills Hierarchy
  • In an embodiment, a hierarchy of learning objects is maintained by skills hierarchy manager 180. Each learning object in the hierarchy is associated with a corresponding skill and content items that help students to master the skill. A particular node is occupied by two learning objects that are competing with one another to be the preferred learning object at that node. Both learning objects are associated with the same skill. However, the learning objects may be associated with different content or logic that defines different content preferences that cause different content to be delivered to different students, even though the circumstances of each student may be the same.
  • For example, two learning objects residing at the same node in a skills hierarchy may be designed to teach the skills required to perform integration by parts, a concept in calculus. The first learning object may employ interactive learning games to teach the concept, while the second learning object may use a series of videos to teach the concept. The first learning object may be recommended to a first student, while the second learning object may be recommended to a second student, even though the first and second students are cognitively similar with respect to personal attributes associated with mathematics.
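  • One way per-category success metadata might drive the choice between competing learning objects at a node is sketched below; the object names and success rates are illustrative assumptions.

```python
# Hypothetical per-category success metadata for two learning objects that
# compete at the same node; the engine picks the object with the better track
# record for the requesting student's category.
node_objects = {
    "LO-integration-games":  {"success_by_category": {"engineering": 0.62, "nursing": 0.81}},
    "LO-integration-videos": {"success_by_category": {"engineering": 0.74, "nursing": 0.58}},
}

def pick_competitive_object(student_category):
    return max(node_objects,
               key=lambda lo: node_objects[lo]["success_by_category"].get(student_category, 0.0))

print(pick_competitive_object("engineering"))  # -> LO-integration-videos
print(pick_competitive_object("nursing"))      # -> LO-integration-games
```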
  • Example Method for Dynamically Altering Learning Object Remediation Preferences
  • In an embodiment, a hierarchy of learning objects is maintained by skills hierarchy manager 180. Each learning object in the hierarchy is associated with a corresponding skill and one or more corresponding assessment items. The assessment items measure the level of success that users attain with respect to the corresponding learning object. In response to determining that a first user has attained a first level of success with respect to a first learning object, a second learning object that is associated with a second skill is selected by learning experience engine 110 for recommendation to the first user.
  • In response to determining that a second user has attained the first level of success with respect to the first learning object, and based at least in part on determining that the first user has attained a second level of success with respect to the second learning object, learning experience engine 110 selects a third learning object to recommend to the second user, wherein the third learning object is associated with a third skill.
  • For example, a first student may have performed poorly on an assessment for a learning object designed to teach the user skills related to graphing polynomial functions. To help the first student bolster his skills, learning experience engine 110 may recommend a second learning object to the first student. The skill associated with the second learning object may be related to graphing functions generally. However, the first student may also perform poorly on an assessment for the second learning object. This may be an indication that the second learning object is not a good remediator for the first learning object, given the level of performance demonstrated by the first student. Therefore, when a second student achieves the same level of performance on the assessment for the first learning object, the second student receives a recommendation for a third learning object, such as one that focuses on more general graphing skills.
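  • A minimal sketch of this history-sensitive remediator choice follows; the objects, performance levels, and the 0.5 cutoff are illustrative assumptions.

```python
# Hypothetical remediator choice that takes earlier outcomes into account: if a
# prior student at the same performance level did not improve after the default
# remediator, the next student at that level is routed to an alternative.
remediation_history = {
    # (failed_object, performance_level): {remediator: observed pass rate on retry}
    ("LO-poly-graphing", "low"): {"LO-graphing-general": 0.0},
}
ALTERNATIVES = {"LO-graphing-general": "LO-graphing-basics"}

def choose_remediator(failed_object, performance_level, default="LO-graphing-general"):
    history = remediation_history.get((failed_object, performance_level), {})
    if history.get(default, 1.0) < 0.5:   # default has not worked at this level
        return ALTERNATIVES.get(default, default)
    return default

print(choose_remediator("LO-poly-graphing", "low"))   # -> LO-graphing-basics
print(choose_remediator("LO-poly-graphing", "high"))  # -> LO-graphing-general
```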
  • Hardware Overview
  • According to one embodiment, the techniques described herein are implemented by one or more special-purpose computing devices. The special-purpose computing devices may be hard-wired to perform the techniques, or may include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination. Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques. The special-purpose computing devices may be desktop computer systems, portable computer systems, handheld devices, networking devices or any other device that incorporates hard-wired and/or program logic to implement the techniques.
  • For example, FIG. 5 is a block diagram that illustrates a computer system 500 upon which an embodiment of the invention may be implemented. Computer system 500 includes a bus 502 or other communication mechanism for communicating information, and a hardware processor 504 coupled with bus 502 for processing information. Hardware processor 504 may be, for example, a general purpose microprocessor.
  • Computer system 500 also includes a main memory 506, such as a random access memory (RAM) or other dynamic storage device, coupled to bus 502 for storing information and instructions to be executed by processor 504. Main memory 506 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 504. Such instructions, when stored in non-transitory storage media accessible to processor 504, render computer system 500 into a special-purpose machine that is customized to perform the operations specified in the instructions.
  • Computer system 500 further includes a read only memory (ROM) 508 or other static storage device coupled to bus 502 for storing static information and instructions for processor 504. A storage device 510, such as a magnetic disk or optical disk, is provided and coupled to bus 502 for storing information and instructions.
  • Computer system 500 may be coupled via bus 502 to a display 512, such as a cathode ray tube (CRT), for displaying information to a computer user. An input device 514, including alphanumeric and other keys, is coupled to bus 502 for communicating information and command selections to processor 504. Another type of user input device is cursor control 516, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 504 and for controlling cursor movement on display 512. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane.
  • Computer system 500 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 500 to be a special-purpose machine. According to one embodiment, the techniques herein are performed by computer system 500 in response to processor 504 executing one or more sequences of one or more instructions contained in main memory 506. Such instructions may be read into main memory 506 from another storage medium, such as storage device 510. Execution of the sequences of instructions contained in main memory 506 causes processor 504 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
  • The term “storage media” as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 510. Volatile media includes dynamic memory, such as main memory 506. Common forms of storage media include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, or any other memory chip or cartridge.
  • Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 502. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
  • Various forms of media may be involved in carrying one or more sequences of one or more instructions to processor 504 for execution. For example, the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 500 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 502. Bus 502 carries the data to main memory 506, from which processor 504 retrieves and executes the instructions. The instructions received by main memory 506 may optionally be stored on storage device 510 either before or after execution by processor 504.
  • Computer system 500 also includes a communication interface 518 coupled to bus 502. Communication interface 518 provides a two-way data communication coupling to a network link 520 that is connected to a local network 522. For example, communication interface 518 may be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 518 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN. Wireless links may also be implemented. In any such implementation, communication interface 518 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
  • Network link 520 typically provides data communication through one or more networks to other data devices. For example, network link 520 may provide a connection through local network 522 to a host computer 524 or to data equipment operated by an Internet Service Provider (ISP) 526. ISP 526 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the “Internet” 528. Local network 522 and Internet 528 both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 520 and through communication interface 518, which carry the digital data to and from computer system 500, are example forms of transmission media.
  • Computer system 500 can send messages and receive data, including program code, through the network(s), network link 520 and communication interface 518. In the Internet example, a server 530 might transmit a requested code for an application program through Internet 528, ISP 526, local network 522 and communication interface 518.
  • The received code may be executed by processor 504 as it is received, and/or stored in storage device 510, or other non-volatile storage for later execution.
  • In the foregoing specification, embodiments of the invention have been described with reference to numerous specific details that may vary from implementation to implementation. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense. The sole and exclusive indicator of the scope of the invention, and what is intended by the applicants to be the scope of the invention, is the literal and equivalent scope of the set of claims that issue from this application, in the specific form in which such claims issue, including any subsequent correction.

Claims (18)

1. A method, comprising:
maintaining, for a plurality of users of an online education system, profiles, wherein the profile maintained for each of the plurality of users describes one or more education-related attributes associated with the corresponding user of the plurality of users;
in response to a request, from a first user of the plurality of users, to interact with one or more other users of the plurality of users:
detecting which particular subject matter the first user is currently working on;
identifying a particular set of other users that are currently working on the particular subject matter or subject matter related to the particular subject matter;
based on a comparison between a first profile that is associated with the first user and a second profile that is associated with a second user of the particular set of users, determining that attribute values of the first profile are similar to attribute values of the second profile;
selecting the second user from among the particular set of users based, at least in part, on the determination that attribute values of the first profile are similar to attribute values of the second profile;
facilitating interaction between the first user and the second user;
wherein the method is performed by one or more computing devices.
2. The method of claim 1, wherein facilitating interaction includes notifying at least one of the first user and the second user about the other of the first user and the second user.
3. The method of claim 1, wherein facilitating interaction includes providing to at least one of the first user and the second user a mechanism by which the at least one user may interact with the other of the first user and the second user.
4. The method of claim 1, wherein detecting that each user of a particular set of other users is currently working on the particular subject matter or subject matter related to the particular subject matter includes detecting that the second user is currently interacting on second subject matter that is associated with a course of study that the particular subject matter is associated with.
5. The method of claim 1, wherein detecting that each user of a particular set of other users is currently working on the particular subject matter or subject matter related to the particular subject matter includes detecting that the second user is currently working on the particular subject matter in association with a course of study with which the first user is not associated.
6. The method of claim 1, further comprising:
based on a comparison between the first profile that is associated with the first user and a third profile that is associated with a third user of the particular set of users, determining that attribute values of the first profile are similar to attribute values of the third profile;
selecting the third user to interact with the first user based at least in part on the determination that attribute values of the first profile are similar to attribute values of the third profile.
7. The method of claim 1, further comprising:
based on a comparison between the first profile that is associated with the first user and a third profile that is associated with a third user of the particular set of users, determining that attribute values of the first profile are similar to attribute values of the third profile;
determining that the third user should not be selected to interact with the first user based at least in part on a determination that an optimal number of users have already been selected as members of a first interaction group that includes the first user.
8. The method of claim 7, wherein the optimal number of users is determined based at least in part on one or more education-related attributes stored in the profiles of members of the first group.
9. The method of claim 1, further comprising:
receiving, from the second user, a learning recommendation directed to the first user, wherein the learning recommendation identifies one or more of: a learning tool; a subject matter other than the particular subject matter; or a learning content item.
10. A computer-readable non-transitory storage medium storing instructions which, when executed by one or more processors, cause the one or more processors to perform:
maintaining, for a plurality of users of an online education system, profiles, wherein the profile maintained for each of the plurality of users describes one or more education-related attributes associated with the corresponding user of the plurality of users;
in response to a request, from a first user of the plurality of users, to interact with one or more other users of the plurality of users:
detecting which particular subject matter the first user is currently working on;
identifying a particular set of other users that are currently working on the particular subject matter or subject matter related to the particular subject matter;
based on a comparison between a first profile that is associated with the first user and a second profile that is associated with a second user of the particular set of users, determining that attribute values of the first profile are similar to attribute values of the second profile;
selecting the second user from among the particular set of users based, at least in part, on the determination that attribute values of the first profile are similar to attribute values of the second profile;
facilitating interaction between the first user and the second user.
11. The computer-readable non-transitory storage medium of claim 10, wherein facilitating interaction includes notifying at least one of the first user and the second user about the other of the first user and the second user.
12. The computer-readable non-transitory storage medium of claim 10, wherein facilitating interaction includes providing to at least one of the first user and the second user a mechanism by which the at least one user may interact with the other of the first user and the second user.
13. The computer-readable non-transitory storage medium of claim 10, wherein detecting that each user of a particular set of other users is currently working on the particular subject matter or subject matter related to the particular subject matter includes detecting that the second user is currently interacting on second subject matter that is associated with a course of study that the particular subject matter is associated with.
14. The computer-readable non-transitory storage medium of claim 10, wherein detecting that each user of a particular set of other users is currently working on the particular subject matter or subject matter related to the particular subject matter includes detecting that the second user is currently working on the particular subject matter in association with a course of study with which the first user is not associated.
15. The computer-readable non-transitory storage medium of claim 10, wherein the instructions further include instructions for:
based on a comparison between the first profile that is associated with the first user and a third profile that is associated with a third user of the particular set of users, determining that attribute values of the first profile are similar to attribute values of the third profile;
selecting the third user to interact with the first user based at least in part on the determination that attribute values of the first profile are similar to attribute values of the third profile.
16. The computer-readable non-transitory storage medium of claim 10, wherein the instructions further include instructions for:
based on a comparison between the first profile that is associated with the first user and a third profile that is associated with a third user of the particular set of users, determining that attribute values of the first profile are similar to attribute values of the third profile;
determining that the third user should not be selected to interact with the first user based at least in part on a determination that an optimal number of users have already been selected as members of a first interaction group that includes the first user.
17. The computer-readable non-transitory storage medium of claim 16, wherein the optimal number of users is determined based at least in part on one or more education-related attributes stored in the profiles of members of the first group.
18. The computer-readable non-transitory storage medium of claim 10, wherein the instructions further include instructions for:
receiving, from the second user, a learning recommendation directed to the first user, wherein the learning recommendation identifies one or more of: a learning tool; a subject matter other than the particular subject matter; or a learning content item.

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110008758A1 (en) * 2009-07-08 2011-01-13 Target Brands, Inc. Training Simulator
US20110306028A1 (en) * 2010-06-15 2011-12-15 Galimore Sarah E Educational decision support system and associated methods
US8214691B1 (en) * 2011-09-30 2012-07-03 Google Inc. System and method for testing internet functionality of a computing device in a simulated internet environment
US20130078598A1 (en) * 2011-09-12 2013-03-28 Uq, Inc. Family and child account social networking
US20130311416A1 (en) * 2012-05-16 2013-11-21 Xerox Corporation Recommending training programs
US8731455B2 (en) 2012-08-21 2014-05-20 Minapsys Software Corporation Computer-implemented method for facilitating creation of an advanced digital communications network, and terminal, system and computer-readable medium for the same
US20160062562A1 (en) * 2014-08-30 2016-03-03 Apollo Education Group, Inc. Automatic processing with multi-selection interface
US20180357703A1 (en) * 2013-03-15 2018-12-13 Sears Brands, L.L.C. Recommendations Based Upon Explicit User Similarity
CN113630461A (en) * 2021-08-05 2021-11-09 东南大学 Online collaborative learning user grouping method based on user interaction trust network

Families Citing this family (176)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9076342B2 (en) 2008-02-19 2015-07-07 Architecture Technology Corporation Automated execution and evaluation of network-based training exercises
US9318026B2 (en) 2008-08-21 2016-04-19 Lincoln Global, Inc. Systems and methods providing an enhanced user experience in a real-time simulated virtual reality welding environment
US9196169B2 (en) 2008-08-21 2015-11-24 Lincoln Global, Inc. Importing and analyzing external data using a virtual reality welding system
US9280913B2 (en) 2009-07-10 2016-03-08 Lincoln Global, Inc. Systems and methods providing enhanced education and training in a virtual reality environment
US8117306B1 (en) 2008-09-29 2012-02-14 Amazon Technologies, Inc. Optimizing content management
US7865594B1 (en) 2008-09-29 2011-01-04 Amazon Technologies, Inc. Managing resources consolidation configurations
US7930393B1 (en) 2008-09-29 2011-04-19 Amazon Technologies, Inc. Monitoring domain allocation performance
US8122124B1 (en) 2008-09-29 2012-02-21 Amazon Technologies, Inc. Monitoring performance and operation of data exchanges
US8316124B1 (en) 2008-09-29 2012-11-20 Amazon Technologies, Inc. Managing network data display
US7917618B1 (en) 2009-03-24 2011-03-29 Amazon Technologies, Inc. Monitoring web site content
US20110039245A1 (en) * 2009-08-14 2011-02-17 Ronald Jay Packard Systems and methods for producing, delivering and managing educational material
US8838015B2 (en) 2009-08-14 2014-09-16 K12 Inc. Systems and methods for producing, delivering and managing educational material
US8768240B2 (en) 2009-08-14 2014-07-01 K12 Inc. Systems and methods for producing, delivering and managing educational material
US20110039247A1 (en) * 2009-08-14 2011-02-17 Ronald Jay Packard Systems and methods for producing, delivering and managing educational material
US20110039242A1 (en) * 2009-08-14 2011-02-17 Ronald Jay Packard Systems and methods for producing, delivering and managing educational material
US20110177480A1 (en) * 2010-01-15 2011-07-21 Satish Menon Dynamically recommending learning content
US20120197733A1 (en) 2011-01-27 2012-08-02 Linkedln Corporation Skill customization system
WO2012159200A1 (en) * 2011-05-23 2012-11-29 Coursepeer Inc. Recommending students to prospective employers based on students' online content
US9043444B2 (en) * 2011-05-25 2015-05-26 Google Inc. Using an audio stream to identify metadata associated with a currently playing television program
US9064017B2 (en) * 2011-06-01 2015-06-23 D2L Corporation Systems and methods for providing information incorporating reinforcement-based learning and feedback
US20130017530A1 (en) * 2011-07-11 2013-01-17 Learning Center Of The Future, Inc. Method and apparatus for testing students
US20130016044A1 (en) * 2011-07-11 2013-01-17 Learning Center Of The Future, Inc. Method and apparatus for selecting educational content
TWI505054B (en) * 2011-07-22 2015-10-21 Yu Chen Lee The use of computer production and training feedback function of the teaching materials
AU2012312055B2 (en) * 2011-09-21 2017-02-09 ValueCorp Pacific, Inc. System and method for mathematics ontology extraction and research
US20130095461A1 (en) 2011-10-12 2013-04-18 Satish Menon Course skeleton for adaptive learning
US20130236877A1 (en) * 2011-11-02 2013-09-12 Andrew H. B. Zhou Systems and methods for providing educational products and services via cloud massive online open course
TWI453701B (en) * 2011-12-30 2014-09-21 Univ Chienkuo Technology Cloud video content evaluation platform
US20130203026A1 (en) * 2012-02-08 2013-08-08 Jpmorgan Chase Bank, Na System and Method for Virtual Training Environment
US9971993B2 (en) 2012-03-26 2018-05-15 Microsoft Technology Licensing, Llc Leveraging a social graph for use with electronic messaging
US20130260355A1 (en) * 2012-04-02 2013-10-03 Gobstopper, Inc. Teaching and learning system
KR101246264B1 (en) * 2012-04-30 2013-03-22 조현구 Method for providing educational community according to age combined with social networking service
US9361807B2 (en) 2012-05-22 2016-06-07 Sri International Method and apparatus for providing collaborative learning
US10403163B2 (en) * 2012-05-22 2019-09-03 Sri International Method and system for providing collaborative learning
WO2014017164A1 (en) * 2012-07-26 2014-01-30 ソニー株式会社 Information processing device, information processing method, and system
US10956956B2 (en) * 2012-08-17 2021-03-23 Ebay Inc. System, method, and computer readable medium for recommendations based on wearable sensors
US9412281B2 (en) * 2013-11-25 2016-08-09 Pearson Education, Inc. Learning system self-optimization
US9406239B2 (en) 2013-12-20 2016-08-02 Pearson Education, Inc. Vector-based learning path
US9436911B2 (en) 2012-10-19 2016-09-06 Pearson Education, Inc. Neural networking system and methods
US9446314B2 (en) 2013-10-25 2016-09-20 Pearson Education, Inc. Vector-based gaming content management
US9288056B1 (en) 2015-05-28 2016-03-15 Pearson Education, Inc. Data access and anonymity management
US20160042198A1 (en) 2012-10-19 2016-02-11 Pearson Education, Inc. Deidentified access of content
US8984650B2 (en) 2012-10-19 2015-03-17 Pearson Education, Inc. Privacy server for protecting personally identifiable information
US9654592B2 (en) 2012-11-08 2017-05-16 Linkedin Corporation Skills endorsements
US20140172844A1 (en) * 2012-12-14 2014-06-19 SRM Institute of Science and Technology System and Method For Generating Student Activity Maps in A University
US20140178849A1 (en) * 2012-12-24 2014-06-26 Dan Dan Yang Computer-assisted learning structure for very young children
CN103533009A (en) * 2013-01-06 2014-01-22 Tcl集团股份有限公司 Method and system of realizing audio and video recommendation based on Web technology
WO2014142621A1 (en) * 2013-03-14 2014-09-18 ㈜엠피디에이 Neural adaptive learning device and neural adaptive learning method using relational concept map
US20140272901A1 (en) * 2013-03-14 2014-09-18 Alcatel-Lucent Methods and apparatus for providing alternative educational content for personalized learning in a class situation
US20140315163A1 (en) * 2013-03-14 2014-10-23 Apple Inc. Device, method, and graphical user interface for a group reading environment
WO2014160316A2 (en) * 2013-03-14 2014-10-02 Apple Inc. Device, method, and graphical user interface for a group reading environment
US20150302088A1 (en) * 2013-03-15 2015-10-22 Yahoo! Inc. Method and System for Providing Personalized Content
TWI490827B (en) * 2013-05-13 2015-07-01 Univ Nat Cheng Kung Real-time video annotation learning system and method thereof
US20150004586A1 (en) * 2013-06-26 2015-01-01 Kyle Tomson Multi-level e-book
US9189968B2 (en) 2013-07-01 2015-11-17 Pearson Education, Inc. Network-probability recommendation system
US10049593B2 (en) 2013-07-15 2018-08-14 International Business Machines Corporation Automated educational system
CN103390357A (en) * 2013-07-24 2013-11-13 天津开发区先特网络系统有限公司 Training and study service device, training system and training information management method
TWI501207B (en) * 2013-08-30 2015-09-21 Method and system for providing landmark services through landmark database
US20150072323A1 (en) 2013-09-11 2015-03-12 Lincoln Global, Inc. Learning management system for a real-time simulated virtual reality welding training environment
US10013892B2 (en) 2013-10-07 2018-07-03 Intel Corporation Adaptive learning environment driven by real-time identification of engagement level
WO2015061415A1 (en) * 2013-10-22 2015-04-30 Exploros, Inc. System and method for collaborative instruction
US9547995B1 (en) 2013-11-18 2017-01-17 Google Inc. Dynamic instructional course
EP3591552B1 (en) * 2013-12-19 2022-03-30 Intel Corporation Protection system including machine learning snapshot evaluation
CN105849792B (en) * 2013-12-23 2020-01-14 Abb瑞士股份有限公司 Interactive interface for asset health management
KR101401722B1 (en) * 2013-12-24 2014-05-30 박형용 Individual order type online learning system
US10891334B2 (en) 2013-12-29 2021-01-12 Hewlett-Packard Development Company, L.P. Learning graph
US20150199911A1 (en) * 2014-01-10 2015-07-16 Laura Paramoure Systems and methods for creating and managing repeatable and measurable learning content
US20150199910A1 (en) * 2014-01-10 2015-07-16 Cox Communications, Inc. Systems and methods for an educational platform providing a multi faceted learning environment
US9576494B2 (en) 2014-01-29 2017-02-21 Apollo Education Group, Inc. Resource resolver
US20150243176A1 (en) * 2014-02-24 2015-08-27 Mindojo Ltd. Virtual course boundaries in adaptive e-learning datagraph structures
US9997083B2 (en) 2014-05-29 2018-06-12 Samsung Electronics Co., Ltd. Context-aware recommendation system for adaptive learning
US10467304B1 (en) * 2014-05-30 2019-11-05 Better Learning, Inc. Recommending educational mobile applications and assessing student progress in meeting education standards correlated to the applications
CN106233358A (en) * 2014-06-02 2016-12-14 林肯环球股份有限公司 System and method for artificial welders training
CN105225553B (en) * 2014-06-17 2018-06-12 中兴通讯股份有限公司 The recommendation method and device of course in online education
US20160027322A1 (en) * 2014-07-28 2016-01-28 Summers & Company Cognitive-skills training system
CN104166713A (en) * 2014-08-14 2014-11-26 百度在线网络技术(北京)有限公司 Network service recommending method and device
US20170084190A1 (en) * 2014-08-21 2017-03-23 BrainQuake Inc Method for Efficiently Teaching Content Using an Adaptive Engine
US10832585B2 (en) 2014-09-26 2020-11-10 Hewlett-Packard Development Company, L.P. Reading progress indicator
US20180253991A1 (en) * 2014-11-03 2018-09-06 Verily Life Sciences Llc Methods and Systems for Improving a Presentation Function of a Client Device
US10347151B2 (en) 2014-11-10 2019-07-09 International Business Machines Corporation Student specific learning graph
US10496420B2 (en) * 2014-12-02 2019-12-03 Cerner Innovation, Inc. Contextual help within an application
US10027739B1 (en) 2014-12-16 2018-07-17 Amazon Technologies, Inc. Performance-based content delivery
US9769248B1 (en) 2014-12-16 2017-09-19 Amazon Technologies, Inc. Performance-based content delivery
US10225365B1 (en) 2014-12-19 2019-03-05 Amazon Technologies, Inc. Machine learning based content delivery
US10311372B1 (en) 2014-12-19 2019-06-04 Amazon Technologies, Inc. Machine learning based content delivery
US9792335B2 (en) 2014-12-19 2017-10-17 International Business Machines Corporation Creating and discovering learning content in a social learning system
US10311371B1 (en) * 2014-12-19 2019-06-04 Amazon Technologies, Inc. Machine learning based content delivery
US9779632B2 (en) * 2014-12-30 2017-10-03 Successfactors, Inc. Computer automated learning management systems and methods
US20160189036A1 (en) * 2014-12-30 2016-06-30 Cirrus Shakeri Computer automated learning management systems and methods
US10225326B1 (en) 2015-03-23 2019-03-05 Amazon Technologies, Inc. Point of presence based data uploading
EP3093803A1 (en) 2015-04-30 2016-11-16 Tata Consultancy Services Limited Systems and methods for contextual recommendation of learning content
US10733898B2 (en) * 2015-06-03 2020-08-04 D2L Corporation Methods and systems for modifying a learning path for a user of an electronic learning system
US20160358494A1 (en) * 2015-06-03 2016-12-08 D2L Corporation Methods and systems for providing a learning path for an electronic learning system
US9852648B2 (en) * 2015-07-10 2017-12-26 Fujitsu Limited Extraction of knowledge points and relations from learning materials
US10803766B1 (en) 2015-07-28 2020-10-13 Architecture Technology Corporation Modular training of network-based training exercises
US10083624B2 (en) 2015-07-28 2018-09-25 Architecture Technology Corporation Real-time monitoring of network-based training exercises
US10614368B2 (en) 2015-08-28 2020-04-07 Pearson Education, Inc. System and method for content provisioning with dual recommendation engines
US9614734B1 (en) 2015-09-10 2017-04-04 Pearson Education, Inc. Mobile device session analyzer
CN106682035A (en) * 2015-11-11 2017-05-17 中国移动通信集团公司 Individualized learning recommendation method and device
WO2017152187A1 (en) * 2016-03-04 2017-09-08 Civitas Learning, Inc. Student data-to-insight-to-action-to-learning analytics system and method
US20170270811A1 (en) * 2016-03-15 2017-09-21 International Business Machines Corporation Lesson plan presentation
US20170270812A1 (en) * 2016-03-16 2017-09-21 Wen Tsung Chu Method for learning assessment
US10325215B2 (en) 2016-04-08 2019-06-18 Pearson Education, Inc. System and method for automatic content aggregation generation
CN105975520B (en) * 2016-04-29 2019-07-23 东北大学 Flexibly configurable individualized learning scheme customization system and customization method
US20170316380A1 (en) * 2016-04-29 2017-11-02 Ceb Inc. Profile enrichment
US20170352117A1 (en) * 2016-06-01 2017-12-07 Coursera, Inc. Automated cohorts for sessions
US10521424B1 (en) 2016-09-21 2019-12-31 Workday, Inc. Educational learning searching using personalized relevance scoring
US11138894B1 (en) * 2016-09-21 2021-10-05 Workday, Inc. Educational learning importation
US20180089570A1 (en) * 2016-09-29 2018-03-29 Linkedin Corporation Skills detector system
CA3040775A1 (en) 2016-10-18 2018-04-26 Minute School Inc. Systems and methods for providing tailored educational materials
CN106528656B (en) * 2016-10-20 2019-08-20 杭州新百锐基业科技股份有限公司 Method and system for course recommendation based on a student's historical and real-time learning state parameters
US10380552B2 (en) 2016-10-31 2019-08-13 Microsoft Technology Licensing, Llc Applicant skills inference for a job
US20190318644A1 (en) * 2016-11-23 2019-10-17 Nelson Education Ltd. End to end educational system and method
US11069250B2 (en) * 2016-11-23 2021-07-20 Sharelook Pte. Ltd. Maze training platform
US11188992B2 (en) * 2016-12-01 2021-11-30 Microsoft Technology Licensing, Llc Inferring appropriate courses for recommendation based on member characteristics
US20180158023A1 (en) * 2016-12-02 2018-06-07 Microsoft Technology Licensing, Llc Project-related entity analysis
US10453354B2 (en) * 2016-12-28 2019-10-22 Coursera, Inc. Automatically generated flash cards
WO2018125893A1 (en) * 2016-12-29 2018-07-05 Becton, Dickinson And Company Digital web-based education platform for delivering targeted and individualized training on medical condition management to users
US10572813B2 (en) 2017-02-13 2020-02-25 Pearson Education, Inc. Systems and methods for delivering online engagement driven by artificial intelligence
US10490092B2 (en) 2017-03-17 2019-11-26 Age Of Learning, Inc. Personalized mastery learning platforms, systems, media, and methods
CN106846960A (en) * 2017-04-21 2017-06-13 江苏开放大学 Network-based online learning assessment system
US10540683B2 (en) * 2017-04-24 2020-01-21 Microsoft Technology Licensing, Llc Machine-learned recommender system for performance optimization of network-transferred electronic content items
CN107092706A (en) * 2017-05-31 2017-08-25 海南大学 5W-oriented, graph-based, goal-driven learning point and learning path recommendation method
CN109040797B (en) * 2017-06-08 2020-06-02 深圳市鹰硕技术有限公司 Internet teaching recording and broadcasting system and method
US10467551B2 (en) 2017-06-12 2019-11-05 Ford Motor Company Portable privacy management
US20190025906A1 (en) 2017-07-21 2019-01-24 Pearson Education, Inc. Systems and methods for virtual reality-based assessment
US20190087748A1 (en) * 2017-09-21 2019-03-21 International Business Machines Corporation Implementing dynamically and automatically altering user profile for enhanced performance
WO2019090434A1 (en) * 2017-11-09 2019-05-16 I-Onconnect Technologies Inc. Method and system for providing education guidance to a user
WO2019125109A1 (en) * 2017-12-19 2019-06-27 Dino Alejandro Pardo Guzman System for assigning mathematics learning rewards to motivate students to follow learning paths
WO2019125106A1 (en) * 2017-12-19 2019-06-27 Pacheco Navarro Diana Model for generating learning paths
CN108231090A (en) * 2018-01-02 2018-06-29 深圳市酷开网络科技有限公司 Text reading level assessment method, device and computer-readable storage medium
US11557223B2 (en) 2018-04-19 2023-01-17 Lincoln Global, Inc. Modular and reconfigurable chassis for simulated welding training
US11475792B2 (en) 2018-04-19 2022-10-18 Lincoln Global, Inc. Welding simulator with dual-user configuration
CN108596804A (en) * 2018-04-28 2018-09-28 重庆玮宜电子科技有限公司 Multithreading online education evaluation method
US11200268B2 (en) 2018-05-04 2021-12-14 International Business Machines Corporation Determining gaps in data
US11210965B2 (en) * 2018-05-17 2021-12-28 Pearson Education, Inc. Diagnostic analyzer for visual-spatial content
US10749890B1 (en) 2018-06-19 2020-08-18 Architecture Technology Corporation Systems and methods for improving the ranking and prioritization of attack-related events
US10817604B1 (en) 2018-06-19 2020-10-27 Architecture Technology Corporation Systems and methods for processing source codes to detect non-malicious faults
CN109064126B (en) * 2018-07-18 2021-03-26 长江勘测规划设计研究有限责任公司 Quality-driven hydropower engineering metering information management method
TWI725375B (en) * 2018-09-07 2021-04-21 台達電子工業股份有限公司 Data search method and data search system thereof
EP3620936A1 (en) 2018-09-07 2020-03-11 Delta Electronics, Inc. System and method for recommending multimedia data
US11380211B2 (en) 2018-09-18 2022-07-05 Age Of Learning, Inc. Personalized mastery learning platforms, systems, media, and methods
US11302208B2 (en) * 2018-10-05 2022-04-12 International Business Machines Corporation Dynamically providing video checkpoint questions to counter viewer distraction events
CN109710811B (en) * 2018-11-28 2021-03-02 汉海信息技术(上海)有限公司 User portrait detection method, device and application system
US11429713B1 (en) 2019-01-24 2022-08-30 Architecture Technology Corporation Artificial intelligence modeling for cyber-attack simulation protocols
US11128654B1 (en) 2019-02-04 2021-09-21 Architecture Technology Corporation Systems and methods for unified hierarchical cybersecurity
US20200286103A1 (en) * 2019-03-04 2020-09-10 Iris.Tv, Inc. Selecting digital media assets based on transitions across categories
US11416558B2 (en) 2019-03-29 2022-08-16 Indiavidual Learning Private Limited System and method for recommending personalized content using contextualized knowledge base
US11887505B1 (en) 2019-04-24 2024-01-30 Architecture Technology Corporation System for deploying and monitoring network-based training exercises
US20220230731A1 (en) * 2019-05-30 2022-07-21 Acerar Ltd. System and method for cognitive training and monitoring
US11514806B2 (en) 2019-06-07 2022-11-29 Enduvo, Inc. Learning session comprehension
US20200388175A1 (en) * 2019-06-07 2020-12-10 Enduvo, Inc. Creating a multi-disciplined learning tool
US11403405B1 (en) 2019-06-27 2022-08-02 Architecture Technology Corporation Portable vulnerability identification tool for embedded non-IP devices
US11417228B2 (en) * 2019-09-18 2022-08-16 International Business Machines Corporation Modification of extended reality environments based on learning characteristics
US11444974B1 (en) 2019-10-23 2022-09-13 Architecture Technology Corporation Systems and methods for cyber-physical threat modeling
US11074476B2 (en) 2019-11-21 2021-07-27 AstrumU, Inc. Data ingestion platform
US10908933B1 (en) * 2019-12-05 2021-02-02 Microsoft Technology Licensing, Llc Brokerage tool for accessing cloud-based services
US11503075B1 (en) 2020-01-14 2022-11-15 Architecture Technology Corporation Systems and methods for continuous compliance of nodes
US11468780B2 (en) 2020-02-20 2022-10-11 Gopalakrishnan Venkatasubramanyam Smart-learning and knowledge retrieval system
US11462117B2 (en) * 2020-05-19 2022-10-04 Enduvo, Inc. Creating lesson asset information
US11922595B2 (en) 2020-05-19 2024-03-05 Enduvo, Inc. Redacting content in a virtual reality environment
US20230138245A1 (en) 2020-05-27 2023-05-04 Nec Corporation Skill visualization device, skill visualization method, and skill visualization program
US20220068153A1 (en) * 2020-09-02 2022-03-03 Cerego Japan Kabushiki Kaisha Personalized learning system
CN112163491B (en) * 2020-09-21 2023-09-01 百度在线网络技术(北京)有限公司 Online learning method, device, equipment and storage medium
US11928607B2 (en) 2020-10-30 2024-03-12 AstrumU, Inc. Predictive learner recommendation platform
US11074509B1 (en) * 2020-10-30 2021-07-27 AstrumU, Inc. Predictive learner score
CN112700688B (en) * 2020-12-25 2021-09-24 电子科技大学 Intelligent classroom teaching auxiliary system
US11462118B1 (en) * 2021-03-12 2022-10-04 International Business Machines Corporation Cognitive generation of learning path framework
CN117178311A (en) * 2021-04-08 2023-12-05 启智知识产权有限责任公司 System and method for learner growth tracking and assessment
DE102021109615A1 (en) 2021-04-16 2022-10-20 Stiftung Sternenstaub Communication system
WO2022244952A1 (en) * 2021-05-20 2022-11-24 주식회사 나인커뮤니케이션 Certification-related content provision method and system, and non-transitory computer-readable recording medium
US20220374812A1 (en) * 2021-05-24 2022-11-24 Skillsacpe Analytics LLC Systems and methods for generation and traversal of a skill representation graph using machine learning
US11645095B2 (en) * 2021-09-14 2023-05-09 Adobe Inc. Generating and utilizing a digital knowledge graph to provide contextual recommendations in digital content editing applications
WO2023095474A1 (en) * 2021-11-24 2023-06-01 ソニーグループ株式会社 Information processing device, information processing method, and program
CN114519143A (en) * 2022-02-18 2022-05-20 北京百度网讯科技有限公司 Course recommendation model training method, course recommendation method and device
US11847172B2 (en) 2022-04-29 2023-12-19 AstrumU, Inc. Unified graph representation of skills and acumen
WO2023220824A1 (en) * 2022-05-17 2023-11-23 Atanasiu Cosmin Petrut System for knowledge collaboration

Citations (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5727950A (en) * 1996-05-22 1998-03-17 Netsage Corporation Agent based instruction system and method
US5957699A (en) * 1997-12-22 1999-09-28 Scientific Learning Corporation Remote computer-assisted professionally supervised teaching system
US6146148A (en) * 1996-09-25 2000-11-14 Sylvan Learning Systems, Inc. Automated testing and electronic instructional delivery and student management system
US6149441A (en) * 1998-11-06 2000-11-21 Technology For Connecticut, Inc. Computer-based educational system
US6301462B1 (en) * 1999-01-15 2001-10-09 Unext. Com Online collaborative apprenticeship
US20010031456A1 (en) * 1999-12-30 2001-10-18 Greg Cynaumon Education system and method for providing educational exercises and establishing an educational fund
US6334779B1 (en) * 1994-03-24 2002-01-01 Ncr Corporation Computer-assisted curriculum
US20020142278A1 (en) * 2001-03-29 2002-10-03 Whitehurst R. Alan Method and system for training in an adaptive manner
US20020188583A1 (en) * 2001-05-25 2002-12-12 Mark Rukavina E-learning tool for dynamically rendering course content
US20020192629A1 (en) * 2001-05-30 2002-12-19 Uri Shafrir Meaning equivalence instructional methodology (MEIM)
US20030003433A1 (en) * 2001-06-29 2003-01-02 Ignite, Inc. Method and system for constructive, modality focused learning
US6505031B1 (en) * 2000-02-25 2003-01-07 Robert Slider System and method for providing a virtual school environment
US20030039948A1 (en) * 2001-08-09 2003-02-27 Donahue Steven J. Voice enabled tutorial system and method
US20040202987A1 (en) * 2003-02-14 2004-10-14 Scheuring Sylvia Tidwell System and method for creating, assessing, modifying, and using a learning map
US20050058978A1 (en) * 2003-09-12 2005-03-17 Benevento Francis A. Individualized learning system
US20050186550A1 (en) * 2004-02-23 2005-08-25 Mubina Gillani System and method for dynamic electronic learning based on continuing student assessments and responses
US20060068367A1 (en) * 2004-08-20 2006-03-30 Parke Helen M System and method for content management in a distributed learning system
US20060188860A1 (en) * 2005-02-24 2006-08-24 Altis Avante, Inc. On-task learning system and method
US20060218225A1 (en) * 2005-03-28 2006-09-28 Hee Voon George H Device for sharing social network information among users over a network
US7165054B2 (en) * 2001-09-14 2007-01-16 Knowledgextensions, Inc. Custom electronic learning system and method
US20070099161A1 (en) * 2005-10-31 2007-05-03 Krebs Andreas S Dynamic learning courses
US20070203589A1 (en) * 2005-04-08 2007-08-30 Manyworlds, Inc. Adaptive Recombinant Process Methods
US20070202484A1 (en) * 2006-02-28 2007-08-30 Michael Toombs Method and System for Educating Individuals
US20070218446A1 (en) * 2006-03-03 2007-09-20 Burck Smith Student interaction management system
US20070300174A1 (en) * 2006-06-27 2007-12-27 Microsoft Corporation Monitoring group activities
US20080014569A1 (en) * 2006-04-07 2008-01-17 Eleutian Technology, Llc Teacher Assisted Internet Learning
US20080038705A1 (en) * 2006-07-14 2008-02-14 Kerns Daniel R System and method for assessing student progress and delivering appropriate content
US20080134045A1 (en) * 2006-07-13 2008-06-05 Neustar, Inc. System and method for adaptively and dynamically configuring a graphical user interface of a mobile communication device
US20080222295A1 (en) * 2006-11-02 2008-09-11 Addnclick, Inc. Using internet content as a means to establish live social networks by linking internet users to each other who are simultaneously engaged in the same and/or similar content
US20090035733A1 (en) * 2007-08-01 2009-02-05 Shmuel Meitar Device, system, and method of adaptive teaching and learning
US20090186329A1 (en) * 2008-01-23 2009-07-23 Carol Connor Method for recommending a teaching plan in literacy education
US20100035225A1 (en) * 2006-07-11 2010-02-11 President And Fellows Of Harvard College Adaptive spaced teaching method and system
US20100041007A1 (en) * 2008-08-13 2010-02-18 Chi Wang Method and System for Knowledge Diagnosis and Tutoring
US7677896B1 (en) * 2002-02-19 2010-03-16 Nishikant Sonwalkar System, method, and computer-readable medium for course structure design to support adaptive learning
US20110010328A1 (en) * 2009-07-10 2011-01-13 Medimpact Healthcare Systems, Inc. Modifying a Patient Adherence Score
US20110039249A1 (en) * 2009-08-14 2011-02-17 Ronald Jay Packard Systems and methods for producing, delivering and managing educational material
US20110126253A1 (en) * 2009-11-20 2011-05-26 At&T Intellectual Property I, L.P. Apparatus and method for managing a social network
US20110177483A1 (en) * 2010-01-15 2011-07-21 Catherine Needham Recommending competitive learning objects
US20130095465A1 (en) * 2011-10-12 2013-04-18 Satish Menon Course skeleton for adaptive learning

Family Cites Families (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04357549A (en) 1991-03-07 1992-12-10 Hitachi Ltd Education system
US5493688A (en) 1991-07-05 1996-02-20 Booz, Allen & Hamilton, Inc. Pattern categorization system having self-organizing analog fields
US5779486A (en) * 1996-03-19 1998-07-14 Ho; Chi Fai Methods and apparatus to assess and enhance a student's understanding in a subject
US6039575A (en) * 1996-10-24 2000-03-21 National Education Corporation Interactive learning system with pretest
US6091930A (en) 1997-03-04 2000-07-18 Case Western Reserve University Customizable interactive textbook
US6322366B1 (en) 1998-06-30 2001-11-27 Assessment Technology Inc. Instructional management system
US6353447B1 (en) * 1999-01-26 2002-03-05 Microsoft Corporation Study planner system and method
AU2733701A (en) * 1999-12-30 2001-07-16 Game Wise, Llc An education system and method for providing educational exercises and establishing an educational fund
US20010049087A1 (en) 2000-01-03 2001-12-06 Hale Janet B. System and method of distance education
US6971881B2 (en) 2000-05-11 2005-12-06 Thomas J Reynolds Interactive method and system for teaching decision making
US6507726B1 (en) * 2000-06-30 2003-01-14 Educational Standards And Certifications, Inc. Computer implemented education system
US20020049689A1 (en) 2000-10-20 2002-04-25 Srinivas Venkatram Systems and methods for visual optimal ordered knowledge learning structures
US6606480B1 (en) * 2000-11-02 2003-08-12 National Education Training Group, Inc. Automated system and method for creating an individualized learning program
US6626679B2 (en) * 2000-11-08 2003-09-30 Acesync, Inc. Reflective analysis system
EP1362337A1 (en) 2001-01-09 2003-11-19 Prep4 Ltd Training system and method for improving user knowledge and skills
JP3731868B2 (en) 2001-03-19 2006-01-05 本田技研工業株式会社 Learning system
US7210938B2 (en) 2001-05-09 2007-05-01 K12.Com System and method of virtual schooling
US6782396B2 (en) 2001-05-31 2004-08-24 International Business Machines Corporation Aligning learning capabilities with teaching capabilities
US20040018479A1 (en) 2001-12-21 2004-01-29 Pritchard David E. Computer implemented tutoring system
US20040014017A1 (en) 2002-07-22 2004-01-22 Lo Howard Hou-Hao Effective and efficient learning (EEL) system
US20040024569A1 (en) 2002-08-02 2004-02-05 Camillo Philip Lee Performance proficiency evaluation method and system
US20040161728A1 (en) 2003-02-14 2004-08-19 Benevento Francis A. Distance learning system
US8182270B2 (en) 2003-07-31 2012-05-22 Intellectual Reserve, Inc. Systems and methods for providing a dynamic continual improvement educational environment
US20060068368A1 (en) 2004-08-20 2006-03-30 Mohler Sherman Q System and method for content packaging in a distributed learning system
US8326659B2 (en) 2005-04-12 2012-12-04 Blackboard Inc. Method and system for assessment within a multi-level organization
US20070009871A1 (en) * 2005-05-28 2007-01-11 Ctb/Mcgraw-Hill System and method for improved cumulative assessment
US20070111188A1 (en) * 2005-11-17 2007-05-17 Shell Timothy A Reference card creation system and method
US7930300B2 (en) * 2005-12-02 2011-04-19 Stephen Colbran Assessment of educational services
US8714986B2 (en) 2006-08-31 2014-05-06 Achieve3000, Inc. System and method for providing differentiated content based on skill level
US20080254438A1 (en) * 2007-04-12 2008-10-16 Microsoft Corporation Administrator guide to student activity for use in a computerized learning environment
US8251704B2 (en) * 2007-04-12 2012-08-28 Microsoft Corporation Instrumentation and schematization of learning application programs in a computerized learning environment
US7849043B2 (en) * 2007-04-12 2010-12-07 Microsoft Corporation Matching educational game players in a computerized learning environment
US20080254434A1 (en) * 2007-04-13 2008-10-16 Nathan Calvert Learning management system
US20100223267A1 (en) * 2009-02-27 2010-09-02 Accenture Global Services Gmbh Matching tools for use in attribute-based performance systems
US20100316986A1 (en) * 2009-06-12 2010-12-16 Microsoft Corporation Rubric-based assessment with personalized learning recommendations
US8456580B2 (en) * 2010-03-08 2013-06-04 Au Optronics Corporation Three-dimensional display and displaying method thereof
US9208155B2 (en) * 2011-09-09 2015-12-08 Rovi Technologies Corporation Adaptive recommendation system

Patent Citations (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6334779B1 (en) * 1994-03-24 2002-01-01 Ncr Corporation Computer-assisted curriculum
US5727950A (en) * 1996-05-22 1998-03-17 Netsage Corporation Agent based instruction system and method
US6146148A (en) * 1996-09-25 2000-11-14 Sylvan Learning Systems, Inc. Automated testing and electronic instructional delivery and student management system
US5957699A (en) * 1997-12-22 1999-09-28 Scientific Learning Corporation Remote computer-assisted professionally supervised teaching system
US6149441A (en) * 1998-11-06 2000-11-21 Technology For Connecticut, Inc. Computer-based educational system
US6301462B1 (en) * 1999-01-15 2001-10-09 Unext. Com Online collaborative apprenticeship
US20010031456A1 (en) * 1999-12-30 2001-10-18 Greg Cynaumon Education system and method for providing educational exercises and establishing an educational fund
US6505031B1 (en) * 2000-02-25 2003-01-07 Robert Slider System and method for providing a virtual school environment
US20020142278A1 (en) * 2001-03-29 2002-10-03 Whitehurst R. Alan Method and system for training in an adaptive manner
US20020188583A1 (en) * 2001-05-25 2002-12-12 Mark Rukavina E-learning tool for dynamically rendering course content
US20020192629A1 (en) * 2001-05-30 2002-12-19 Uri Shafrir Meaning equivalence instructional methodology (MEIM)
US20030003433A1 (en) * 2001-06-29 2003-01-02 Ignite, Inc. Method and system for constructive, modality focused learning
US20030039948A1 (en) * 2001-08-09 2003-02-27 Donahue Steven J. Voice enabled tutorial system and method
US7165054B2 (en) * 2001-09-14 2007-01-16 Knowledgextensions, Inc. Custom electronic learning system and method
US7677896B1 (en) * 2002-02-19 2010-03-16 Nishikant Sonwalkar System, method, and computer-readable medium for course structure design to support adaptive learning
US20040202987A1 (en) * 2003-02-14 2004-10-14 Scheuring Sylvia Tidwell System and method for creating, assessing, modifying, and using a learning map
US20050058978A1 (en) * 2003-09-12 2005-03-17 Benevento Francis A. Individualized learning system
US20050186550A1 (en) * 2004-02-23 2005-08-25 Mubina Gillani System and method for dynamic electronic learning based on continuing student assessments and responses
US20060068367A1 (en) * 2004-08-20 2006-03-30 Parke Helen M System and method for content management in a distributed learning system
US20060188860A1 (en) * 2005-02-24 2006-08-24 Altis Avante, Inc. On-task learning system and method
US20060218225A1 (en) * 2005-03-28 2006-09-28 Hee Voon George H Device for sharing social network information among users over a network
US20070203589A1 (en) * 2005-04-08 2007-08-30 Manyworlds, Inc. Adaptive Recombinant Process Methods
US20070099161A1 (en) * 2005-10-31 2007-05-03 Krebs Andreas S Dynamic learning courses
US20070202484A1 (en) * 2006-02-28 2007-08-30 Michael Toombs Method and System for Educating Individuals
US20070218446A1 (en) * 2006-03-03 2007-09-20 Burck Smith Student interaction management system
US20080014569A1 (en) * 2006-04-07 2008-01-17 Eleutian Technology, Llc Teacher Assisted Internet Learning
US20070300174A1 (en) * 2006-06-27 2007-12-27 Microsoft Corporation Monitoring group activities
US20100035225A1 (en) * 2006-07-11 2010-02-11 President And Fellows Of Harvard College Adaptive spaced teaching method and system
US20080134045A1 (en) * 2006-07-13 2008-06-05 Neustar, Inc. System and method for adaptively and dynamically configuring a graphical user interface of a mobile communication device
US20080038705A1 (en) * 2006-07-14 2008-02-14 Kerns Daniel R System and method for assessing student progress and delivering appropriate content
US20080222295A1 (en) * 2006-11-02 2008-09-11 Addnclick, Inc. Using internet content as a means to establish live social networks by linking internet users to each other who are simultaneously engaged in the same and/or similar content
US20090035733A1 (en) * 2007-08-01 2009-02-05 Shmuel Meitar Device, system, and method of adaptive teaching and learning
US20090186329A1 (en) * 2008-01-23 2009-07-23 Carol Connor Method for recommending a teaching plan in literacy education
US20100041007A1 (en) * 2008-08-13 2010-02-18 Chi Wang Method and System for Knowledge Diagnosis and Tutoring
US20110010328A1 (en) * 2009-07-10 2011-01-13 Medimpact Healthcare Systems, Inc. Modifying a Patient Adherence Score
US20110039249A1 (en) * 2009-08-14 2011-02-17 Ronald Jay Packard Systems and methods for producing, delivering and managing educational material
US20110126253A1 (en) * 2009-11-20 2011-05-26 At&T Intellectual Property I, L.P. Apparatus and method for managing a social network
US20110177483A1 (en) * 2010-01-15 2011-07-21 Catherine Needham Recommending competitive learning objects
US20120164621A1 (en) * 2010-01-15 2012-06-28 Nitzan Katz Facilitating targeted interaction in a networked learning environment
US20120164620A1 (en) * 2010-01-15 2012-06-28 Catherine Needham Recommending competitive learning objects
US20130095465A1 (en) * 2011-10-12 2013-04-18 Satish Menon Course skeleton for adaptive learning
US20130095461A1 (en) * 2011-10-12 2013-04-18 Satish Menon Course skeleton for adaptive learning

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110008758A1 (en) * 2009-07-08 2011-01-13 Target Brands, Inc. Training Simulator
US8353704B2 (en) * 2009-07-08 2013-01-15 Target Brands, Inc. Training simulator
US20110306028A1 (en) * 2010-06-15 2011-12-15 Galimore Sarah E Educational decision support system and associated methods
US20130078598A1 (en) * 2011-09-12 2013-03-28 Uq, Inc. Family and child account social networking
US8214691B1 (en) * 2011-09-30 2012-07-03 Google Inc. System and method for testing internet functionality of a computing device in a simulated internet environment
US20130311416A1 (en) * 2012-05-16 2013-11-21 Xerox Corporation Recommending training programs
US8731455B2 (en) 2012-08-21 2014-05-20 Minapsys Software Corporation Computer-implemented method for facilitating creation of an advanced digital communications network, and terminal, system and computer-readable medium for the same
US20180357703A1 (en) * 2013-03-15 2018-12-13 Sears Brands, L.L.C. Recommendations Based Upon Explicit User Similarity
US10769702B2 (en) * 2013-03-15 2020-09-08 Transform Sr Brands Llc Recommendations based upon explicit user similarity
US20160062562A1 (en) * 2014-08-30 2016-03-03 Apollo Education Group, Inc. Automatic processing with multi-selection interface
US9612720B2 (en) 2014-08-30 2017-04-04 Apollo Education Group, Inc. Automatic processing with multi-selection interface
US9665243B2 (en) * 2014-08-30 2017-05-30 Apollo Education Group, Inc. Mobile intelligent adaptation interface
CN113630461A (en) * 2021-08-05 2021-11-09 东南大学 Online collaborative learning user grouping method based on user interaction trust network

Also Published As

Publication number Publication date
CA2787133A1 (en) 2011-07-21
US9583016B2 (en) 2017-02-28
BR112012017226A8 (en) 2018-06-26
US20120164620A1 (en) 2012-06-28
CN102822882B (en) 2016-02-03
US20110177483A1 (en) 2011-07-21
CN102822882A (en) 2012-12-12
EP2524362A1 (en) 2012-11-21
US20110177480A1 (en) 2011-07-21
US20120164621A1 (en) 2012-06-28
US20130266922A1 (en) 2013-10-10
MX2012008206A (en) 2012-10-09
BR112012017226A2 (en) 2017-06-13
WO2011088412A1 (en) 2011-07-21

Similar Documents

Publication Publication Date Title
US9583016B2 (en) Facilitating targeted interaction in a networked learning environment
US10360809B2 (en) Course skeleton for adaptive learning
Shute et al. Review of computer‐based assessment for learning in elementary and secondary education
US20070224586A1 (en) Method and system for evaluating and matching educational content to a user
Querol Julián et al. The impact of online technologies and English medium instruction on university lectures in international learning contexts: A systematic review
KR20100042636A (en) Device, system, and method of adaptive teaching and learning
Thornes Creating an online tutorial to support information literacy and academic skills development.
De Marsico et al. A strategy to join adaptive and reputation-based social-collaborative e-learning, through the zone of proximal development
Andersson et al. Increasing interactivity in distance educations: Case studies Bangladesh and Sri Lanka
US20200228424A1 (en) Method and system for automated multidimensional assessment generation and delivery
Han Student modelling and adaptivity in web-based learning systems
Riad et al. Review of e-Learning Systems Convergence from Traditional Systems to Services based Adaptive and Intelligent Systems.
Kozierkiewicz-Hetmańska A method for scenario recommendation in intelligent e-learning systems
US20190019097A1 (en) Method and system for bayesian network-based standard or skill mastery determination using a collection of interim assessments
El-Bakry et al. Advanced technology for E-learning development
Fang et al. Effective college English teaching based on teacher-student interactive model
Blank et al. A web-based ITS for OO design
Ismaili et al. D-Learning and COVID-19 Crisis: Appraisal of Reactions and Means of Perpetuity
Thornes Creating an online tutorial to develop academic and research skills
MacKenzie The effects of online courses with multimedia in learners' perceived satisfaction and effectiveness of E-learning
Dias et al. Adaptive learning management system to support an intelligent tutoring module
Alkhuraiji Dynamic adaptive E-learning mechanism based on learning styles
US20200226944A1 (en) Method and system for automated multidimensional content selection and presentation
Li An intelligent and effective e-learning system that provides tailored lessons to students
Saeed Integration and acceptance of web 2.0 technologies in higher education

Legal Events

Date Code Title Description
AS Assignment

Owner name: APOLLO GROUP, INC., ARIZONA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KATZ, NITZAN;MENON, SATISH;REEL/FRAME:025743/0104

Effective date: 20110125

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: THE UNIVERSITY OF PHOENIX, INC., ARIZONA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:APOLLO EDUCATION GROUP, INC.;REEL/FRAME:053308/0512

Effective date: 20200626