US20090162827A1 - Integrated assessment system for standards-based assessments - Google Patents
- Publication number
- US20090162827A1 (application Ser. No. 12/222,385)
- Authority
- US
- United States
- Prior art keywords
- test
- standards
- assessments
- assessment
- customized
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/28—Databases characterised by their database models, e.g. relational or object models
- G06F16/284—Relational databases
- G06F16/285—Clustering or classification
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B7/00—Electrically-operated teaching apparatus or devices working with questions and answers
- G09B7/02—Electrically-operated teaching apparatus or devices working with questions and answers of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by a student
Definitions
- a major challenge facing educational programs in the 21st century is to promote learning aimed at the achievement of valued goals or standards.
- an educational management system has been designed to assist programs to promote goal-directed standards-based learning, for example as described in U.S. Pat. Nos. 6,322,366 and 6,468,085.
- the present patent application details additional innovations that enhance the usefulness of the system for learners involved in a variety of standards-based educational programs.
- a particularly important group of such learners comprises elementary and secondary school students receiving instruction aimed at the achievement of federal, state, and local standards.
- Item Response Theory is used to estimate the probability that a learner will be ready to acquire capabilities reflecting goals that have not yet been mastered.
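The patent does not specify which IRT model form is used for this readiness estimate; a common choice is the three-parameter logistic (3PL) model, sketched below with illustrative parameter names:

```python
import math

def irt_probability(theta: float, a: float, b: float, c: float = 0.0) -> float:
    """Probability of a correct response under the 3PL IRT model.

    theta: learner ability estimate
    a: item discrimination; b: item difficulty; c: guessing floor
    (the model choice and parameter names are assumptions, not from the patent)
    """
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# a learner whose ability equals the item difficulty (theta == b) has a
# probability midway between the guessing floor and 1.0
p = irt_probability(theta=0.0, a=1.2, b=0.0, c=0.2)
```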
- the present invention can be summarized in an integrated assessment system for standards-based education.
- the system including: (a) Assessment planning within a Benchmark Planner, allowing for a series of customized assessments aligned to standards and delivered on a schedule determined by the user. (b) Automatic benchmark test generation wherein these tests are: (i) Part of an overall benchmark plan that may cover up to an entire academic year. (ii) Able to restrict specific items from being re-tested if included in an earlier benchmark test during that planning period. (iii) Aligned to standards in accordance with the specifications used in the selected benchmark plan. (iv) Optimized for printed test length by associating a flexible number of items with a block of text, reducing overall test length and minimizing white space.
- (c) Benchmark test review wherein the review process contains: (i) Agency review of tests constructed using the Benchmark Planner. (ii) Multi-phase review, allowing distinct groups to participate in the review process. (iii) Ability to align the items with the associated standard within the review process. (iv) Use of distinct review statuses, e.g. Not Reviewed, Accept, and Replace. (d) Manual construction of tests using Test Builder, including: (i) The ability to hand-enter test questions with text, equations, and images, import printable tests, or search a pre-populated test item bank. (ii) Ability to create a standards-based assessment by aligning items with a standard, or to circumvent standard alignment. (iii) Multi-phase construction, e.g.
- (e) Offline test administration including: (i) The ability to scan answer sheets using optical scanning technology with proprietary answer sheets, or to print plain-paper answer sheets and subsequently scan them after they have been filled out by the student. (ii) A scanner controlled by a client-side computer connected to the Internet. (iii) Software installed on the client computer, used to connect to the student assessment database to upload assessment item responses. (iv) Bar-coded answer sheets linking the answer sheet to a specific test ID. (v) An algorithm scoring the lightest and darkest marks on an answer sheet to enhance accuracy in determining what constitutes a marked response. (vi) The ability for the client to automatically submit images of scanned answer sheets in the event of a processing problem.
- (f) Test administration including: (i) Test entry through an online student center using dual password identification at the student login and test login levels, or through classroom administration using handheld input devices. (ii) Real-time progress updates for the student during test administration. (iii) Ability to administer tests with students using either individual computers or hand-held response pads, obviating the need to have a computer for each student in the testing environment. Test administration using response pads further including: (1) Software installed on a single client computer, connected to a response pad receiver. (2) A central display device, either an instructor's computer or whiteboard, to display test questions, or handheld devices capable of displaying test questions for the student.
- the invention also may be summarized in an integrative data analysis system to promote standards mastery.
- the system includes: (a) Innovations making it possible to: (i) Combine multiple tests into a single assessment, (ii) Combine parts of tests to make a new test, and (iii) combine information from tests, class assignments, and other data sources into one scale. (b) The ability to estimate the test score needed to achieve standards mastery and to identify objectives to be mastered to achieve the required test score. (i) Required learning objective estimates are linked to the Benchmark Planner. (c) A Risk Assessment initiative to determine whether or not standards have been achieved. The initiative includes: (i) Predictive abilities to determine which students are on course to meet standards and which students are at risk for not meeting standards.
- Tests used to determine mastery may include multiple test types (e.g. benchmark, formative, and/or State test).
- FIGS. 1 , 4 , 7 , 9 , 11 , 13 , 15 , 17 , 20 , 23 , 28 , 29 , 31 , 33 , 35 , 37 , 39 , 41 , 43 , 45 , 47 , 49 , 56 , 58 , 60 , 63 , 65 and 69 are step diagrams setting forth the operation and construction of the integrated assessment system of the present invention.
- FIGS. 2 , 3 , 5 , 6 , 8 , 10 , 12 , 14 , 16 , 18 , 19 , 21 , 22 , 24 , 25 , 26 , 27 , 30 , 32 , 34 , 36 , 38 , 40 , 42 , 44 , 46 , 48 , 50 , 51 , 52 , 53 , 54 , 55 , 59 , 61 , 62 , 64 , 66 , 67 , 69 and 70 are illustrations of computer screens of the integrated assessment system of the present invention setting forth various steps of the above step diagrams.
- Standards-based education requires that instruction be targeted toward the achievement of shared goals articulated in standards. For example, currently educational agencies across the nation are targeting instruction toward the achievement of shared goals reflected in state and local standards.
- the effective pursuit of shared goals in a given educational agency (e.g., a school district or charter school) requires consistency and continuity in instruction and assessment.
- the desired consistency and continuity are typically reflected in the agency's curriculum and assessment plans, which spell out the goals of instruction and the sequence in which those goals will be pursued and assessed.
- An integrated assessment system is developed to support the construction, scheduling, and administration of customized assessments based on input from large numbers of teachers and administrators who are working together toward the achievement of shared goals.
- the system supports agency-wide assessment planning covering multiple assessments during the school year. It integrates planning with test construction, agency-wide assessment review, and agency-wide and classroom based test scheduling.
- the system supports agency-wide online and/or offline test administration and automated and/or manual scoring.
- a second form of planning is long-range planning involving multiple assessments designed to assess student performance related to curriculum plans coordinating instruction across multiple grades for an entire school year.
- Technology supporting this type of planning has been lacking.
- An innovative tool called Assessment or Benchmark Planner has been developed to support long-range planning involving multiple assessments aligned to standards.
- the Assessment or Benchmark Planner makes it possible to plan a series of customized assessments aligned to standards and to schedule the delivery of those assessments at successive times across the school year.
- the planning process begins with the selection of the year during which the plan will be in effect, and the subject and associated standards to which the assessment items called for in the plan will be aligned.
- the selected standards are often state standards. However, agencies have the option of entering their own local standards using Scale Builder technology.
- a Plan Transfer feature allows the user to transfer a plan from a previous year to the current year.
- a Copy feature allows the user to make a copy of a plan.
- the Transfer and Copy features provide planning continuity across time and across related planning initiatives.
- Benchmark Planner allows the agency to initiate the process of giving people a voice in the development process by specifying individuals who will have the responsibility of reviewing draft assessments developed in accordance with the plan. This is accomplished using the Set Test Reviewers feature.
- the completed plan can also be printed and disseminated to interested parties.
- the user has the option of specifying the number of items to be used to assess each standard (benchmark) to be measured on the test. For example, the user may indicate that four items should be selected to measure each standard. A projected delivery date is also automatically recorded for the assessment. The standards potentially available for assessment are displayed in a series of check boxes. The user checks the standards to be assessed on each benchmark. In addition, the user may overwrite the number of items allotted to each standard. For example, suppose that the user has indicated that there should be four items for each standard assessed on the test. The user may overwrite the global specification of four items, choosing, for instance, to include three items to measure achievement related to a particular standard.
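The plan structure described above (a global items-per-standard count with optional per-standard overrides) can be sketched as a simple data structure; all names here are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class BenchmarkPlan:
    """Illustrative sketch of one test's plan: a global item count per
    standard, with optional per-standard overrides."""
    default_items_per_standard: int
    selected_standards: set = field(default_factory=set)
    overrides: dict = field(default_factory=dict)

    def items_for(self, standard: str) -> int:
        # unchecked standards are not assessed at all
        if standard not in self.selected_standards:
            return 0
        # a per-standard override beats the global specification
        return self.overrides.get(standard, self.default_items_per_standard)

plan = BenchmarkPlan(default_items_per_standard=4)
plan.selected_standards.update({"M.5.1", "M.5.2"})
plan.overrides["M.5.2"] = 3   # overwrite the global four-item specification
```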
- the user may indicate that the plan is complete and save the plan.
- the completion of all plans at a given grade level provides a multi-assessment benchmark plan covering an entire school year for that grade level.
- the completion of plans at multiple grade levels provides a global plan for assessments to be conducted at all of the selected grade levels.
- a benchmark test may be automatically generated using the Generate Benchmark Tests feature. This feature is unique in important ways. It treats each benchmark test as part of an overall benchmark plan that generally covers an entire school year. This treatment makes it possible to impose useful restrictions on the entire series of tests associated with the plan. If an item has appeared on a previous benchmark assessment, it can be restricted from appearing on subsequent tests. Likewise, if a previous test includes one or more questions referring to a particular text, the text and any questions referring to it can be excluded from subsequent benchmark assessments. The restrictions help to insure that student performance reflects student skill and not merely responses to specific test items.
- the process of generating a test is initiated by selecting the benchmark plan to be used in guiding the construction of the assessment. Next the user selects the item banks to be called upon to generate the test.
- the third step is to indicate a library which will be used to store and retrieve the test and to specify the subject and grade level for the test. Subject and grade-level information is used to automatically generate a title for the test.
- the final step is to press the Generate Test button.
- the system selects items aligned with standards in accordance with the specifications in the selected benchmark plan. It keeps items linked to a common text or image together, and it implements an algorithm to order items in ways that save paper required to print test booklets in the event that offline test administration is elected. If items needed to meet some of the requirements of the benchmark plan are unavailable, the system prints a report identifying the number of required items and indicating the standard(s) to which they are to be assigned. These reports can play an important role in guiding item development activities.
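The generation step can be sketched as follows, under assumed data shapes (the patent does not disclose its actual algorithm): items already used earlier in the plan, and any items sharing a text passage with them, are excluded; shortfalls are collected for the report; and passage-linked items are kept together.

```python
def generate_test(plan, bank, used_item_ids):
    """Sketch of benchmark test generation. Hypothetical shapes:
    plan maps standard -> number of items needed;
    bank is a list of dicts with 'id', 'standard', and 'passage' keys
    (passage is None for standalone items)."""
    # texts already used on earlier tests in the plan are excluded entirely
    excluded_passages = {i["passage"] for i in bank
                         if i["id"] in used_item_ids and i["passage"]}
    selected, missing = [], {}
    for standard, needed in plan.items():
        candidates = [i for i in bank
                      if i["standard"] == standard
                      and i["id"] not in used_item_ids
                      and i["passage"] not in excluded_passages]
        selected.extend(candidates[:needed])
        if len(candidates) < needed:
            # shortfall report used to guide item development
            missing[standard] = needed - len(candidates)
    # keep passage-linked items adjacent to shorten the printed booklet
    selected.sort(key=lambda i: (i["passage"] or "", i["id"]))
    return selected, missing
```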
- Test review serves a special function in standards-based education because benchmark tests are designed to assess many students receiving instruction through the efforts of many educators. All of these educators have a stake in the assessment process. In order to meet the assessment needs of teachers and administrators responsible for student instruction, it is important for the views of all stakeholders to be represented in shaping benchmark assessments.
- a two-phase test review process makes it possible for stakeholders to review assessments and achieve needed modifications prior to test publication and administration.
- the first phase is called initial review. Any number of reviewers can participate in the initial review process. For example, a school district might designate all fifth-grade teachers as initial reviewers of a fifth-grade benchmark math test. Alternatively, the district might form a review committee to serve as initial reviewers.
- the second phase is a final review. There is only one final reviewer. The final reviewer has access to all of the initial reviews. It is the final reviewer's responsibility to produce a review that guides construction of the final version of the benchmark test.
- Both the initial and final reviews are carried out using the Test Review feature of the system.
- a review is initiated by selecting the benchmark assessment to be reviewed.
- the reviewer has the option of displaying certain categories of items. For example, initially a reviewer might wish to see all items. At a later time, a reviewer interested in checking his or her judgments might elect to see only those items that had been previously reviewed.
- Each item to be reviewed is displayed along with the standard to which it is aligned.
- the review status of the item is displayed following the item.
- a comment box is provided below the three status categories.
- the reviewer has the option of saving a review at any time.
- the reviewer also may delete the review.
- the reviewer may indicate that the review is complete.
- when a final reviewer indicates that a review is complete, a message is sent to the assessment staff. After receiving the message, an assessment staff member goes over the final review with the final reviewer. This part of the review process is included to facilitate the construction of reliable and valid benchmark assessments.
- the assessment staff member activates a replace button, which appears next to each item for which the review status is Replace.
- when the button is pressed, the appropriate item bank is searched and possible replacement items are displayed. An item may then be selected from the list of replacement items and inserted into the test. When replacements are completed, the test is ready for publication.
- Assessment options included in the system include the capability to construct assessments using a feature called Test Builder.
- Test Builder is used mainly at the classroom level to enable teachers to construct class quizzes and formative assessments used to guide instruction related to objectives that are immediate targets for instruction.
- Test construction is initiated in Test Builder by entering the title of the test and selecting a test library in which to store the test for later retrieval. The next step is to enter general test-taking instructions.
- One option is to import a printable version of the entire test from a word-processing file. This option provides the capability to print test booklets and automatically score scanned answer sheets for assessments developed outside the system and administered offline.
- the second option is to construct items using text editing, equation editing, and image importation features provided in Test Builder.
- This option allows users to construct their own items, and to edit, or delete items from the test.
- the third option is to search recorded item banks.
- search options are available. Users may search for items aligned to a particular objective. They may conduct a key word search of objectives, and they may search for groups of items all linked to the same text or image. Search features also include the capability to automatically generate all or part of a test by designating the objectives to be included in the test and the number of items to be included for each of the selected objectives. Items selected from the banks may be copied. The copies may be edited to customize the items to user needs.
- each item included in an assessment constructed using Test Builder is aligned to a standard. As indicated earlier, all items in the banks are aligned to standards. Thus, items selected from the banks for inclusion on a test constructed using Test Builder are aligned to standards. When a new item is constructed in Test Builder, the user is requested to select the standard to which the item is aligned.
- although the system is specifically designed to support standards-based assessment, there are cases in which users may not wish to align items to standards. The system allows for the creation of tests that are not aligned to standards. This is accomplished by creating a dummy objective for each item on the test.
- when a new test is developed, the test is automatically assigned a status labeled Construction Phase. When construction is completed, the user may make a series of changes in the status. Following construction, the user may change the test status to Tryout Phase. During the Tryout Phase the test may be scheduled for tryout administration, but scores will not be saved. The user also has the option to change test status to Item Review Phase. In the Item Review Phase the test may be subjected to test review in the manner described for benchmark tests. Finally, the user may change the status to Publication Phase. When test status is changed to Publication Phase, the test can be scheduled and examinee responses will be saved. After a test is published, it cannot be changed or deleted. This rule assures the ability to trace examinee responses back to the test that was actually taken. Although a published test cannot be changed, it can be copied, and the copy can be edited.
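The status lifecycle described above can be sketched as a small state machine; the exact set of allowed transitions is an assumption based on the order described, with publication as a terminal state:

```python
from enum import Enum

class TestStatus(Enum):
    CONSTRUCTION = "Construction Phase"
    TRYOUT = "Tryout Phase"
    ITEM_REVIEW = "Item Review Phase"
    PUBLICATION = "Publication Phase"

# hypothetical transition table consistent with the described workflow;
# a published test can never change status (it is frozen for traceability)
ALLOWED = {
    TestStatus.CONSTRUCTION: {TestStatus.TRYOUT, TestStatus.ITEM_REVIEW,
                              TestStatus.PUBLICATION},
    TestStatus.TRYOUT: {TestStatus.ITEM_REVIEW, TestStatus.PUBLICATION},
    TestStatus.ITEM_REVIEW: {TestStatus.PUBLICATION},
    TestStatus.PUBLICATION: set(),
}

def can_transition(current: TestStatus, new: TestStatus) -> bool:
    return new in ALLOWED[current]
```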
- a Bulk Scheduler feature makes it possible to schedule agency-wide assessments quickly and easily.
- Bulk Scheduler allows the user to schedule a test for all schools or selected schools in an agency. The user has the additional option of scheduling the test for all classes in the set of selected schools or selected classes in those schools.
- the user specifies the dates within which the test is scheduled for administration. The user may specify a user name and password for students who will be taking the test online. In addition, the user may specify a date for posting assessment results to appropriate audiences (e.g., students and parents).
- Test scheduling at the class level is accomplished using the Class Calendar feature. This feature allows the user to schedule the dates for a class test, a user name and password for the test, and the dates when scores will be posted. As soon as scheduling information has been entered, it appears along with other events on the teacher's Class Calendar.
- a feature called Scanline includes the ability to scan answer sheets in order to support offline test administration.
- the ability to scan answer sheets has been well established for many years.
- Scanline supports established optical scanning technology using proprietary answer sheets.
- Scanline also supports more recently developed technology making it possible to print and subsequently scan plain paper answer sheets.
- Scanline requires a scanner controlled by a client-side computer connected to the Internet.
- Scanline software is downloaded from the Internet to the client machine.
- the software makes it possible to scan proprietary answer sheets and send information regarding examinee responses over the Internet to a server, which automatically scores the responses.
- the software may send information regarding student responses.
- scanned images may also be sent to a server.
- Scanline includes a number of innovations that increase the ease of use of plain-paper scanning technology and that enhance the ability to detect and correct scanning errors using the plain paper approach.
- Scanline technology identifies form types and form characteristics automatically. This feature enhances ease of use because it supports dynamic form printing and scanning.
- the answer sheet includes a barcode containing the test-administration ID.
- the barcode is read on the client machine. Web services are then called that indicate to the client the number of items on the test and the number of alternatives associated with each item. For example, the returned information might indicate that the test contains 35 items, that items 1 through 12 are true-false items, and that the remaining items are 4-alternative multiple choice items.
- Dynamic scanning minimizes the time required to process scanning information. For example, if there are only 35 items on a test, the scanner would process only 35 items even though the form type might be capable of including many more items than 35.
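A minimal sketch of dynamic scanning, assuming the web service returns a per-item list of alternative counts (all shapes and names here are hypothetical): the scanner processes only the bubbles that actually exist on this test, rather than every position the form type could hold.

```python
def build_scan_layout(test_layout):
    """Given the layout returned by the web service for the test ID read
    from the barcode (a list of (item_number, n_alternatives) pairs),
    return the per-item bubble counts the scanner should process."""
    return {item: n_alts for item, n_alts in test_layout}

# e.g. a 35-item test: items 1-12 are true/false, the rest 4-alternative MC;
# only these 35 items are processed even if the form could hold more
layout = build_scan_layout(
    [(i, 2) for i in range(1, 13)] + [(i, 4) for i in range(13, 36)]
)
```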
- Plain paper scanning necessarily involves printing answer sheets on multiple printers.
- the output of printers may vary substantially along the light-to-dark dimension. This fact creates circumstances in which an unmarked alternative printed on one printer may be much darker or lighter than the same alternative printed on another printer.
- One approach for determining whether or not an alternative has been marked is to set a darkness threshold expressed in pixels. If the alternative exceeds the threshold, it is classified as marked. Printer variation can make that approach unreliable. In the case in which printer output is dark, unmarked alternatives may exceed the threshold and be incorrectly classified as being marked.
- an algorithm uses the lightest mark for an alternative as an anchor against which to judge other alternatives.
- the current implementation of the algorithm recognizes three categories along the light-dark dimension: Light, regular, and dark.
- a multiplier is applied to establish the threshold for determining the percentage of pixels required in the annotated space to classify the bubble as marked.
- a different multiplier is used for each of the three categories in order to insure adequate classification accuracy.
- the thresholds for the three categories and the multipliers are determined by empirical tests.
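The lightest-mark anchoring approach might look like the following sketch. The category cutoffs and multipliers here are invented placeholders, since the patent states that the real values are determined by empirical tests:

```python
def classify_marks(densities, multipliers=(1.8, 1.5, 1.3),
                   category_cutoffs=(0.10, 0.25)):
    """Classify each bubble of one item as marked or unmarked.

    densities: fraction of dark pixels per bubble. The lightest bubble
    serves as the anchor; its darkness places the sheet in a light,
    regular, or dark print category, and a category-specific multiplier
    sets the 'marked' threshold. All numeric constants are illustrative.
    """
    anchor = min(densities)
    if anchor < category_cutoffs[0]:
        m = multipliers[0]          # light printer output
    elif anchor < category_cutoffs[1]:
        m = multipliers[1]          # regular printer output
    else:
        m = multipliers[2]          # dark printer output
    threshold = max(anchor * m, 0.02)  # floor guards against an all-white anchor
    return [d > threshold for d in densities]
```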
- Scanline stores images of scanned answer sheets on the client machine.
- the user opens up a scanning history feature. This feature shows each scanned image and the status of the image. For example, if there is a scanning problem, the history indicates that an error has occurred and specifies the nature of the error.
- the user then has the option of making needed adjustments and rescanning the sheet or submitting the image to a server. For example, if the sheet was initially scanned upside down, the user may choose to rescan it.
- Assessment options include online assessment as well as offline assessment. Moreover, both options are available within a single assessment. Online assessment is carried out in a virtual student center. A dual password approach provides two levels of security for online assessment. Each student is assigned a username and password that allows entry into the student center. A second username and password enables the student to enter the testing environment for the particular assessment that the student is scheduled to take. When the student logs into the test, his or her name appears at the top of the test. This helps proctors to insure that the students scheduled to take the test are actually the individuals who do take the test.
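The dual-password flow can be sketched as two successive credential lookups; all data shapes and names below are illustrative, not the patent's implementation:

```python
def authenticate(center_creds, test_creds, student_db, schedule_db):
    """Two-level login sketch: the first credential pair admits the student
    to the student center, the second admits them to the specific scheduled
    test. Returns None if either level fails."""
    user = student_db.get(center_creds)          # level 1: student center
    if user is None:
        return None
    test = schedule_db.get(test_creds)           # level 2: scheduled test
    if test is None or user not in test["roster"]:
        return None
    # the student's name can now be displayed at the top of the test,
    # helping proctors verify the test-taker's identity
    return {"student": user, "test": test["id"]}
```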
- the student is provided with instructions explaining how to navigate through the assessment, how to indicate her/his response, and how to review questions.
- the online testing feature affords flexible navigation, which enables the student to go to any item at any time.
- Contextual materials such as narratives or charts appear above the question to which they are attached. If the amount of such material is extensive, it appears in a window that permits the student to use a scroll bar to view all of the contextual content.
- Students may respond to items in a variety of ways depending on the type of item. For example, to respond to items in a multiple-choice format, the student points and clicks on the “radio button” next to her/his response and then clicks Save My Answer. For these and all other questions, the system will automatically take the student to the next question once the save button is clicked.
- test completion status bar indicates the proportion of items that have been completed, and a test summary screen lists the questions that have been answered and those that have not been answered. If the student has inadvertently omitted one or more items, the summary can alert the student to the omissions.
- Assessment innovations include a feature called Mercury that provides the ability to administer assessments using hand-held response pad systems. These systems allow students to enter responses to assessment questions on hand-held units. The units then wirelessly transmit the student's responses to a receiver where they are read and recorded by Mercury, on the instructor's computer in a classroom or computer lab environment. Mercury includes features that increase the ease and efficiency with which devices of this type may be used for administration of assessments. These innovations also help to ensure that data is accurately collected, particularly in the event of technical problems such as hardware failure.
- Mercury includes an application that is installed on the computer located in the classroom being used to administer the assessments.
- the application has the ability to communicate directly and automatically with the Galileo database over the Internet via web services located on servers. Because this communication is seamless and automatic, it eliminates several of the steps that would otherwise be required of the user. The advantages of this approach are described in detail in the following discussion of the steps required for this type of assessment administration.
- the first step that a teacher must take is to start the Mercury application.
- the program uses web services to communicate automatically with the Galileo database.
- the data that is passed to Mercury includes information such as the available students and the scheduled tests.
- the need for the teacher to take any extra steps to manually download information in order to start administration of a test is eliminated.
- the teacher is not required to log into the Galileo servers and download the available tests before they can be selected for administration.
- the list of tests is automatically available.
- the next step in administration is for the teacher to distribute the hand held units to the students. Students then start working on the assessment using the units to enter their answers to the questions. Their responses are received by the Mercury program and saved to the Galileo database via web services. This process is entirely automatic without the teacher being required to take any action. There is no need for manual uploading of student responses at the end of test administration. This increases the ease of use for the teacher because there are fewer tasks for them to perform. The accuracy of the resulting data will also be increased because there is no need for the teacher to remember and successfully perform the necessary steps to complete a manual data upload of student responses. Because the responses are being recorded continually, there is also greater protection in the event of hardware failure. For example, should the hard drive on the computer running the Mercury program fail in the middle of test administration, the data loss would be quite limited. Any student responses that had been entered prior to the time of the failure will have already been recorded on the Galileo servers.
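The continual-saving behavior can be sketched as a recorder that forwards each response as it arrives, so a client failure loses at most the unsent tail. Here `upload_fn` stands in for the web-service call; the structure is an assumption, not the disclosed implementation:

```python
import queue

class ResponseRecorder:
    """Sketch of continual response recording: each answer received from a
    response pad is queued and uploaded immediately, rather than batched
    into a manual end-of-test upload."""
    def __init__(self, upload_fn):
        self.upload_fn = upload_fn   # hypothetical web-service call
        self.pending = queue.Queue()

    def record(self, student_id, item, answer):
        self.pending.put((student_id, item, answer))
        self.flush()

    def flush(self):
        # drain everything queued so far; on hardware failure, only
        # responses not yet flushed would be lost
        while not self.pending.empty():
            self.upload_fn(self.pending.get())
```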
- test monitoring is possible through the use of Galileo and Mercury. While a group of students is taking a test, teachers may log into the Galileo test administration screen or use links from within the Mercury application to view a monitoring screen showing all students currently taking the test, which questions have been answered, and whether each question was answered correctly or incorrectly.
- Standards-based assessment initiatives generally include assessment information gathered by multiple agencies. For example, under the No Child Left Behind Act, statewide assessments of standards mastery are required each year at specified grade levels. These tests are often accompanied by local standards-based assessment initiatives such as benchmark assessment programs implemented by local school districts. Although both types of assessment are typically aimed at measuring the mastery of state standards, the data from these assessments are generally not linked in ways that provide flexible data combinations to support accurate mastery classification and provide information that can be used to promote standards mastery.
- An integrative data analysis system links assessment data from local educational agencies to data from super ordinate agencies such as state departments of education in ways that promote accurate mastery classification and the achievement of shared goals such as those reflected in state standards.
- the present application introduces technology innovations based on existing ATI patents. These innovations make it possible to combine multiple tests into a single assessment, to combine parts of tests to make a new test, and to combine information from tests, class assignments, and other data sources into one scale. These combinatorial assessments can be utilized along with other assessment information to guide instruction toward the mastery of standards.
- ATI's combinatorial innovations have a number of practical benefits. Combining data from different sources into a single assessment can be expected to increase the reliability of the assessment because test reliability is a direct function of test length (e.g. Nunnally & Bernstein, 1994).
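The cited relationship between reliability and test length is the classical Spearman-Brown prophecy formula, which can be computed directly:

```python
def spearman_brown(reliability: float, length_factor: float) -> float:
    """Spearman-Brown prophecy formula: predicted reliability when a test is
    lengthened by length_factor with comparable items. This is the classical
    result behind the claim that combining data sources into one longer
    assessment increases reliability."""
    return (length_factor * reliability) / (1 + (length_factor - 1) * reliability)

# doubling a test whose reliability is 0.70 predicts roughly 0.82
r2 = spearman_brown(0.70, 2.0)
```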
- The data sources being combined are those directly linked to instruction. For example, class assignments and class quizzes are routine features of instruction. When data gleaned from these instructional mainstays are combined in ways that yield psychometrically sound assessments, it is possible to assess the relationship between those assessments and other high-stakes assessments such as statewide tests used to determine student mastery of standards.
- The relationship between combinatorial assessments and high-stakes assessments provides a measure of the extent to which performance measured as part of instruction assesses the same thing as high-stakes measures of student performance used to evaluate schools and students.
- A benchmark test may provide information suggesting an intervention targeting certain capabilities. After the intervention, a short formative assessment may be given to determine whether or not the targeted capabilities have been acquired. It may be useful to create and score a new combinatorial test substituting scores on items from the quiz for corresponding scores on items from the benchmark. The new test might then be used to revise estimates of the risk that students may have of not meeting standards as measured on a statewide test.
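The substitution step described above can be sketched as a simple overlay of item-score records, after which the combined record is rescored like any other test. The item IDs and record layout below are hypothetical:

```python
# Hedged sketch: scores on quiz items replace scores on the
# corresponding benchmark items, yielding a new combinatorial response
# record. Item IDs and the 0/1 scoring are illustrative assumptions.

def combine_responses(benchmark: dict, quiz: dict) -> dict:
    """Overlay quiz item scores on the benchmark record."""
    combined = dict(benchmark)
    combined.update(quiz)  # quiz scores win where item IDs overlap
    return combined

benchmark = {"item_01": 0, "item_02": 1, "item_03": 0}
quiz      = {"item_01": 1, "item_03": 1}   # post-intervention retest
print(combine_responses(benchmark, quiz))
# {'item_01': 1, 'item_02': 1, 'item_03': 1}
```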
- Ability scores computed from combinatorial assessments can play an important role in guiding instruction. The continuous IRT ability score and item parameters are used to estimate the probability that a student will be able to perform tasks reflecting the mastery of particular standards. This information can be used to determine what capabilities to target for instruction.
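One plausible sketch of this computation, assuming a two-parameter logistic (2PL) IRT model (the text does not specify the model form); the standard IDs, item parameters, and 0.5 cut-off are illustrative:

```python
import math

# Hedged sketch of the IRT step described above: a continuous ability
# score and item parameters estimate the probability that a student can
# perform a task tied to a standard. A 2PL model is assumed for
# illustration; all parameter values are invented.

def p_correct(theta: float, a: float, b: float) -> float:
    """2PL probability of success: discrimination a, difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Flag standards whose mastery probability falls below a cut-off,
# marking them as candidates for targeted instruction.
items = {"NUM.1": (1.2, -0.5), "NUM.2": (0.9, 0.8), "GEO.1": (1.5, 1.6)}
theta = 0.4  # student's estimated ability
targets = [s for s, (a, b) in items.items() if p_correct(theta, a, b) < 0.5]
print(targets)  # ['NUM.2', 'GEO.1']
```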
Abstract
A computer integrated assessment system for standards-based assessments wherein the assessments conform to recorded standards. The system includes algorithms for directing the generation of a plan of a series of customized assessments aligned to respective selected standards of the recorded standards, wherein each of the customized assessments is set for a different time within a period of time, and algorithms for enabling a user to generate a test in one of the customized assessments, wherein one of the selected standards of the one customized assessment is displayed to the user during generation of questions for the test. During generation of tests, repetition of questions, or of questions on selected subject matter, can be prevented. Individuals or groups can be assigned to the customized assessments. Accuracy in grading answer sheets by scanning is improved by an algorithm that compares the lightest and darkest answer marks to determine the intended answer. Additionally, student handheld devices can be used to answer questions in a test, with automatic grading or scoring.
Description
- This application is related to U.S. Pat. Nos. 6,322,366, 6,468,085 and 7,065,516 and U.S. patent application Ser. No. 11/009,708, all of which patents and application are expressly incorporated in their entirety herein. This application claims benefit under 35 U.S.C. § 119(e)(1) of U.S.
Provisional Patent Applications 60/963,675 and 60/963,676, which are expressly incorporated in their entirety herein. Additionally the disclosure in U.S. Published Patent Application 2003/00044762 is expressly incorporated in its entirety herein. - A major challenge facing educational programs in the 21st century is to promote learning aimed at the achievement of valued goals or standards. In an effort to assist educators to meet this challenge, an educational management system has been designed to assist programs to promote goal-directed standards-based learning, for example as described in U.S. Pat. Nos. 6,322,366 and 6,468,085. The present patent application details additional innovations that enhance the usefulness of the system for learners involved in a variety of standards-based educational programs. A particularly important group of such learners is comprised of elementary and secondary school students receiving instruction aimed at the achievement of federal, state, and local standards.
- In one educational management system, instruction to promote goal-directed learning is informed by assessment information indicating the capabilities that a learner has acquired and those that the learner will be ready to learn in the future. Item Response Theory (IRT) is used to estimate the probability that a learner will be ready to acquire capabilities reflecting goals that have not yet been mastered. There is room for innovations that enhance the construction of assessments and the use of assessment information to inform goal-directed standards-based learning.
- In the educational management system, instruction to promote goal-directed learning is informed by assessment information indicating the capabilities that a learner has acquired and those that the learner will be ready to learn in the future. Item Response Theory (IRT) is used to estimate the probability that a learner will be ready to acquire capabilities reflecting goals that have not yet been mastered. The system includes innovations that enhance the construction of assessments and the use of assessment information to inform goal-directed standards-based learning.
- The present invention can be summarized in an integrated assessment system for standards-based education. The system includes: (a) Assessment planning within a Benchmark Planner, allowing for a series of customized assessments aligned to standards and delivered on a schedule determined by the user. (b) Automatic benchmark test generation wherein these tests are: (i) Part of an overall benchmark plan that may cover up to an entire academic year. (ii) Able to restrict specific items from being re-tested if included in an earlier benchmark test during that planning period. (iii) Composed of items aligned in accordance with the specifications used in the selected benchmark plan. (iv) Capable of optimizing overall printed test length by associating a flexible number of items with a block of text to reduce overall test length and minimize white space. (c) Benchmark test review wherein the review process contains: (i) Agency review of tests constructed using the benchmark planner. (ii) Multi-phase review, allowing distinct groups to participate in the review process. (iii) Ability to align the items with the associated standard within the review process. (iv) Use of distinct review statuses, e.g. Not Reviewed, Accept, and Replace. (d) Manual construction of tests using Test Builder, including: (i) The ability to hand-enter test questions with text, equations and images, import printable tests, or search a pre-populated test item bank. (ii) Ability to create a standards-based assessment by aligning items with a standard, or to circumvent standard alignment. (iii) Multi-phase construction, e.g. construction phase, tryout phase, item review, and publication phase. (e) Offline test administration including: (i) The ability to scan answer sheets using optical scanning technology with proprietary answer sheets, or to print and subsequently scan answer sheets on plain paper that have been filled out by the student. 
(ii) A scanner controlled by a client-side computer connected to the Internet. (iii) Software installed on the client computer, used to connect to the student assessment database to upload assessment item responses. (iv) Bar-coded answer sheets linking the answer sheet to a specific test ID. (v) An algorithm comparing the lightest and darkest marks on an answer sheet, to enhance accuracy in determining what constitutes a marked response. (vi) The ability for the client to automatically submit images of scanned answer sheets in the event of a processing problem. (vii) Ability to use both offline and online administration within a single assessment. (f) Online test administration, including: (i) Test entry through an online student center using dual password identification at the student login and test login levels, or through classroom administration using handheld input devices. (ii) Real-time progress updates for the student during test administration. (iii) Ability to administer tests with students using either individual computers or hand-held response pads, obviating the need to have a computer for each student in the testing environment. Test administration using response pads further including: (1) Software installed on a single client computer, connected to a response pad receiver. (2) A central display device, either an instructor's computer or whiteboard, to display test questions, or handheld devices capable of displaying test questions for the student. (3) Handheld response pads using wireless technology to transfer student responses to the client-side application managing the student response pads. (4) Automated transfer of data from the client-side application managing the student response pads to the assessment database. (iv) Test questions and answers transferred to the client computer after each response is saved, reducing the risk of data loss in the event of hardware failure. 
(g) Test monitoring capability, with the ability for teachers to view individual student item responses, and notation of correct/incorrect responses by question. (h) Combinatorial assessments, where standards mastery may be determined from diverse data sources such as class assignments and online/offline assessments. (i) Use of item parameters along with a continuous ability score in combinatorial assessments to compute the probability that a student will achieve any of a series of goals in a scale comprised of a set of goals.
- The invention also may be summarized in an integrative data analysis system to promote standards mastery. The system includes: (a) Innovations making it possible to: (i) Combine multiple tests into a single assessment, (ii) Combine parts of tests to make a new test, and (iii) Combine information from tests, class assignments, and other data sources into one scale. (b) The ability to estimate the test score needed to achieve standards mastery and to identify objectives to be mastered to achieve the required test score. (i) Required learning objective estimates are linked to the Benchmark Planner. (c) A Risk Assessment initiative to determine whether or not standards have been achieved. The initiative includes: (i) Predictive abilities to determine which students are on course to meet standards and which students are at risk for not meeting standards. (ii) A model basing estimates of the risk that students will not meet standards on data gathered during a previous year. (iii) Assessment of the validity of estimates based on computation of new estimates when new data have been collected on all measures involved in the assessment. (d) Use of multiple tests to determine standards mastery. Tests used to determine mastery may include multiple test types (e.g. benchmark, formative, and/or State test).
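The risk classification in (c) can be sketched under the assumption of a simple linear prediction fit on a previous year's data; the coefficients and cut score below are invented for illustration, not values from the system:

```python
# Hedged sketch of risk assessment: a model fit on last year's paired
# benchmark/statewide data predicts this year's statewide score from a
# benchmark result; predictions below the mastery cut flag the student
# as at risk. All constants are illustrative assumptions.

CUT_SCORE = 400                  # assumed mastery threshold on the state scale
INTERCEPT, SLOPE = 150.0, 5.0    # assumed fit from a previous year's data

def predicted_state_score(benchmark_score: float) -> float:
    return INTERCEPT + SLOPE * benchmark_score

def risk_status(benchmark_score: float) -> str:
    return "on course" if predicted_state_score(benchmark_score) >= CUT_SCORE else "at risk"

print(risk_status(55))  # 150 + 275 = 425 -> "on course"
print(risk_status(40))  # 150 + 200 = 350 -> "at risk"
```

When new data are collected on all measures, the model can be refit and its predictions compared against observed outcomes, matching the validity check in (c)(iii).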
- FIGS. 1, 4, 7, 9, 11, 13, 15, 17, 20, 23, 28, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 56, 58, 60, 63, 65 and 69 are step diagrams setting forth the operation and construction of the integrated assessment system of the present invention.
-
FIGS. 2, 3, 5, 6, 8, 10, 12, 14, 16, 18, 19, 21, 22, 24, 25, 26, 27, 30, 32, 34, 36, 38, 40, 42, 44, 46, 48, 50, 51, 52, 53, 54, 55, 59, 61, 62, 64, 66, 67, 69 and 70 are illustrations of computer screens of the integrated assessment system of the present invention setting forth various steps of the above step diagrams. - Standards-based education requires that instruction be targeted toward the achievement of shared goals articulated in standards. For example, educational agencies across the nation are currently targeting instruction toward the achievement of shared goals reflected in state and local standards. The effective pursuit of shared goals in a given educational agency (e.g., school district or charter school) calls for a coordinated agency-wide effort designed to insure consistency and continuity in instruction and assessment related to instruction. The desired consistency and continuity are typically reflected in the agency's curriculum and assessment plans, which spell out the goals of instruction and the sequence in which those goals will be pursued and assessed.
- An integrated assessment system is developed to support the construction, scheduling, and administration of customized assessments based on input from large numbers of teachers and administrators who are working together toward the achievement of shared goals. The system supports agency-wide assessment planning covering multiple assessments during the school year. It integrates planning with test construction, agency-wide assessment review, and agency-wide and classroom based test scheduling. The system supports agency-wide online and/or offline test administration and automated and/or manual scoring.
- Assessment Planning with Benchmark Planner
- Two kinds of assessment planning occur in standards-based education. The most familiar involves planning to meet the needs of individual students in the classroom setting. For example, a teacher may construct a brief classroom quiz to measure specific capabilities currently targeted for instruction. Performance on the quiz may be used as a basis for recommending specific instructional activities designed to promote mastery of the targeted skills.
- A second form of planning is long-range planning involving multiple assessments designed to assess student performance related to curriculum plans coordinating instruction across multiple grades for an entire school year. Technology supporting this type of planning has been lacking. An innovative tool called Assessment or Benchmark Planner has been developed to support long-range planning involving multiple assessments aligned to standards. The Assessment or Benchmark Planner makes it possible to plan a series of customized assessments aligned to standards and to schedule the delivery of those assessments at successive times across the school year. The planning process begins with the selection of the year during which the plan will be in effect, and the subject and associated standards to which the assessment items called for in the plan will be aligned. The selected standards are often state standards. However, agencies have the option of entering their own local standards using Scale Builder technology. A Plan Transfer feature allows the user to transfer a plan from a previous year to the current year. A Copy feature allows the user to make a copy of a plan. The Transfer and Copy features provide planning continuity across time and across related planning initiatives.
- Many people typically have a stake in the development of Benchmark assessments. The Benchmark Planner allows the agency to initiate the process of giving people a voice in the development process by specifying individuals who will have the responsibility of reviewing draft assessments developed in accordance with the plan. This is accomplished using the Set Test Reviewers feature. The completed plan can also be printed and disseminated to interested parties.
- For each benchmark test, the user has the option of specifying the number of items to be used to assess each standard (benchmark) to be measured on the test. For example, the user may indicate that four items should be selected to measure each standard. A projected delivery date for the assessment is also automatically recorded for the assessment. The standards potentially available for assessment are displayed in a series of check boxes. The user checks the standards to be assessed on each benchmark. In addition, the user may overwrite the number of items allotted to each standard. For example, suppose that the user has indicated that there should be four items for each standard assessed on the test. The user may overwrite the global specification of four items. For instance, the user might choose to include three items to measure achievement related to a particular standard.
- When the benchmark plan for one or more assessments is complete, the user may indicate that the plan is complete and save the plan. The completion of all plans at a given grade level provides a multi-assessment benchmark plan covering an entire school year for that grade level. The completion of plans at multiple grade levels provides a global plan for assessments to be conducted at all of the selected grade levels.
- When a benchmark plan for a given test has been completed, a benchmark test may be automatically generated using the Generate Benchmark Tests feature. This feature is unique in important ways. It treats each benchmark test as part of an overall benchmark plan that generally covers an entire school year. This treatment makes it possible to impose useful restrictions on the entire series of tests associated with the plan. If an item has appeared on a previous benchmark assessment, it can be restricted from appearing on subsequent tests. Likewise, if a previous test includes one or more questions referring to a particular text, the text and any questions referring to it can be excluded from subsequent benchmark assessments. The restrictions help to insure that student performance reflects student skill and not merely responses to specific test items.
- The process of generating a test is initiated by selecting the benchmark plan to be used in guiding the construction of the assessment. Next the user selects the item banks to be called upon to generate the test. The third step is to indicate a library which will be used to store and retrieve the test and to specify the subject and grade level for the test. Subject and grade-level information is used to automatically generate a title for the test. The final step is to press the Generate Test button.
- When the Generate Test button is pressed, the system selects items aligned with standards in accordance with the specifications in the selected benchmark plan. It keeps items linked to a common text or image together, and it implements an algorithm to order items in ways that save paper required to print test booklets in the event that offline test administration is elected. If items needed to meet some of the requirements of the benchmark plan are unavailable, the system prints a report identifying the number of required items and indicating the standard(s) to which they are to be assigned. These reports can play an important role in guiding item development activities.
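The generation step can be sketched as follows; the data shapes, field names, and shortfall report are illustrative assumptions, not the patent's implementation:

```python
# Hedged sketch of benchmark test generation: select the planned number
# of items per standard from the chosen banks, skipping items already
# used on earlier tests in the plan or tied to a previously used text,
# and report any shortfall for item-development purposes.

def generate_test(plan, bank, used_items, used_texts):
    """plan: {standard: n_items}; bank: list of dicts with
    'id', 'standard', and optional 'text_id' keys."""
    selected, shortfall = [], {}
    for standard, needed in plan.items():
        pool = [it for it in bank
                if it["standard"] == standard
                and it["id"] not in used_items
                and it.get("text_id") not in used_texts]
        chosen = pool[:needed]
        selected.extend(chosen)
        if len(chosen) < needed:
            shortfall[standard] = needed - len(chosen)
    return selected, shortfall

bank = [
    {"id": "i1", "standard": "S1"},
    {"id": "i2", "standard": "S1"},
    {"id": "i3", "standard": "S2", "text_id": "t1"},
]
selected, shortfall = generate_test({"S1": 2, "S2": 1}, bank,
                                    used_items={"i2"}, used_texts={"t1"})
print([it["id"] for it in selected], shortfall)  # ['i1'] {'S1': 1, 'S2': 1}
```

The shortfall report corresponds to the system's printed report identifying the number of required items and the standards to which they are to be assigned.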
- After a series of benchmark tests have been generated in the system, the agency for which the tests were constructed has the option of engaging in test review. Test review serves a special function in standards-based education because benchmark tests are designed to assess many students receiving instruction through the efforts of many educators. All of these educators have a stake in the assessment process. In order to meet the assessment needs of teachers and administrators responsible for student instruction, it is important for the views of all stakeholders to be represented in shaping benchmark assessments.
- A two-phase test review process makes it possible for stakeholders to review assessments and achieve needed modifications prior to test publication and administration. The first phase is called initial review. Any number of reviewers can participate in the initial review process. For example, a school district might designate all fifth-grade teachers as initial reviewers of a fifth-grade benchmark math test. Alternatively, the district might form a review committee to serve as initial reviewers. The second phase is a final review. There is only one final reviewer. The final reviewer has access to all of the initial reviews. It is the final reviewer's responsibility to produce a review that guides construction of the final version of the benchmark test.
- Both the initial and final reviews are carried out using the Test Review feature of the system. A review is initiated by selecting the benchmark assessment to be reviewed. The reviewer has the option of displaying certain categories of items. For example, initially a reviewer might wish to see all items. At a later time, a reviewer interested in checking his or her judgments might elect to see only those items that had been previously reviewed.
- Each item to be reviewed is displayed along with the standard to which it is aligned. The review status of the item is displayed following the item. There are three status categories, which may be selected by the reviewer: Not Reviewed, Accept, and Replace. Not Reviewed is initially selected for all items that have not yet been reviewed. A comment box is provided below the three status categories.
- The reviewer has the option of saving a review at any time. The reviewer also may delete the review. Finally, the reviewer may indicate that the review is complete. When a final reviewer indicates that a review is complete, a message is sent to the assessment staff. After receiving the message, an assessment staff member goes over the final review with the final reviewer. This part of the review process is included to facilitate the construction of reliable and valid benchmark assessments.
- The assessment staff member activates a replace button, which appears next to each item for which the review status is Replace. When the button is pressed, the appropriate item bank is searched and possible replacement items are displayed. An item may then be selected from the list of replacement items and inserted into the test. When replacements are completed, the test is ready for publication.
- Assessment options included in the system include the capability to construct assessments using a feature called Test Builder. Test Builder is used mainly at the classroom level to enable teachers to construct class quizzes and formative assessments used to guide instruction related to objectives that are immediate targets for instruction. Test construction is initiated in Test Builder by entering the title of the test and selecting a test library in which to store the test for later retrieval. The next step is to enter general test-taking instructions. There are three basic options for entering items into the test. One option is to import a printable version of the entire test from a word-processing file. This option provides the capability to print test booklets and automatically score scanned answer sheets for assessments developed outside the system and administered offline. The second option is to construct items using text editing, equation editing, and image importation features provided in Test Builder. This option allows users to construct their own items, and to edit, or delete items from the test. The third option is to search recorded item banks. A variety of search options are available. Users may search for items aligned to a particular objective. They may conduct a key word search of objectives, and they may search for groups of items all linked to the same text or image. Search features also include the capability to automatically generate all or part of a test by designating the objectives to be included in the test and the number of items to be included for each of the selected objectives. Items selected from the banks may be copied. The copies may be edited to customize the items to user needs.
- In keeping with the standards-based approach to assessment, each item included in an assessment constructed using Test Builder is aligned to a standard. As indicated earlier, all items in the banks are aligned to standards. Thus, items selected from the banks for inclusion on a test constructed using Test Builder are aligned to standards. When a new item is constructed in Test Builder, the user is requested to select the standard to which the item is aligned. Although the system is specifically designed to support standards-based assessment, there are cases in which users may not wish to align items to standards. The system allows for the creation of tests that are not aligned to standards. This is accomplished by creating a dummy objective for each item on the test.
- When a new test is developed, the test is automatically assigned a status labeled Construction Phase. When construction is completed, the user may make a series of changes in the status. Following construction, the user may change the test status to Tryout Phase. During the Tryout Phase the test may be scheduled for tryout administration, but scores will not be saved. The user also has the option to change test status to Item Review Phase. In the Item Review Phase the test may be subjected to test review in the manner described for benchmark tests. Finally, the user may change the status to Publication Phase. When test status is changed to Publication Phase, the test can be scheduled and examinee responses will be saved. After a test is published, it cannot be changed or deleted. This rule assures the ability to trace examinee responses back to the test that was actually taken. Although a published test cannot be changed, it can be copied, and the copy can be edited.
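One plausible reading of the status lifecycle above is a small state machine with a publication lock; the class, transition table, and property below are illustrative, not the system's actual code:

```python
# Hedged sketch of the test-status lifecycle: a test moves forward
# through Construction, Tryout, Item Review, and Publication, scores
# are saved only after publication, and a published test can no longer
# change status. The allowed-transition table is an assumption.

ALLOWED = {
    "Construction": {"Tryout", "Item Review", "Publication"},
    "Tryout": {"Item Review", "Publication"},
    "Item Review": {"Publication"},
    "Publication": set(),   # published tests cannot be changed or deleted
}

class Test:
    def __init__(self, title):
        self.title, self.status = title, "Construction"

    def set_status(self, new_status):
        if new_status not in ALLOWED[self.status]:
            raise ValueError(f"cannot move from {self.status} to {new_status}")
        self.status = new_status

    @property
    def scores_saved(self):
        # Examinee responses are saved only after publication.
        return self.status == "Publication"

t = Test("Unit 3 Quiz")
t.set_status("Tryout")       # tryout administration; scores not saved
t.set_status("Publication")  # now schedulable; responses will be saved
print(t.scores_saved)  # True
```

The empty transition set for Publication enforces the traceability rule: since a published test is frozen, examinee responses can always be traced back to the exact test taken; editing is possible only on a copy.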
- After a test has been published, it can be scheduled for online and/or offline administration. Scheduling options are included to accommodate benchmark and classroom formative assessments. Since benchmark tests are typically administered agency wide, an approach that makes it possible to schedule tests for large numbers of schools and classes can save those involved in the scheduling process large amounts of time. A Bulk Scheduler feature makes it possible to schedule agency-wide assessments quickly and easily. Bulk Scheduler allows the user to schedule a test for all schools or selected schools in an agency. The user has the additional option of scheduling the test for all classes in the set of selected schools or selected classes in those schools. After schools and classes have been chosen, the user specifies the dates within which the test is scheduled for administration. The user may specify a user name and password for students who will be taking the test online. In addition, the user may specify a date for posting assessment results to appropriate audiences (e.g., students and parents).
- Although benchmark tests are typically scheduled for large groups of students, other types of assessments such as class quizzes are appropriately scheduled at the class level. Test scheduling at the class level is accomplished using the Class Calendar feature. This feature allows the user to schedule the dates for a class test, specify a user name and password for the test, and set the dates when scores will be posted. As soon as scheduling information has been entered, it appears along with other events on the teacher's Class Calendar.
- Offline Test Administration with Scanline
- Assessment innovations incorporate a feature called Scanline that includes the ability to scan answer sheets in order to support offline test administration. The ability to scan answer sheets has been well established for many years. Scanline supports established optical scanning technology using proprietary answer sheets. Scanline also supports more recently developed technology making it possible to print and subsequently scan plain paper answer sheets.
- The operation of Scanline requires a scanner controlled by a client-side computer connected to the Internet. Scanline software is downloaded from the Internet to the client machine. The software makes it possible to scan proprietary answer sheets and send information regarding examinee responses over the Internet to a server, which automatically scores the responses. In the case of plain paper scanning, the software may send information regarding student responses. However, scanned images may also be sent to a server.
- Scanline includes a number of innovations that increase the ease of use of plain-paper scanning technology and that enhance the ability to detect and correct scanning errors using the plain paper approach. Scanline technology identifies form types and form characteristics automatically. This feature enhances ease of use because it supports dynamic form printing and scanning. When a user prints an answer sheet for a selected test, the answer sheet includes a barcode containing the test-administration ID. When the sheet is subsequently scanned, the barcode is read on the client machine. Web services are then called that indicate to the client the number of items on the test and the number of alternatives associated with each item. For example, the barcode might indicate that the test contained 35 items and that
items 1 through 12 were true-false items and that the remaining items were 4-alternative multiple choice items. This information would be used to control the information processed by the scanner. Dynamic scanning minimizes the time required to process scanning information. For example, if there are only 35 items on a test, the scanner would process only 35 items even though the form type might be capable of including many more items than 35. - One of the special problems associated with plain-paper scanning involves the determination of what constitutes a marked response. Plain paper scanning necessarily involves printing answer sheets on multiple printers. The output of printers may vary substantially along the light-to-dark dimension. This fact creates circumstances in which an unmarked alternative printed on one printer may be much darker or lighter than the same alternative printed on another printer. One approach for determining whether or not an alternative has been marked is to set a darkness threshold expressed in pixels. If the alternative exceeds the threshold, it is classified as marked. Printer variation can make that approach unreliable. In the case in which printer output is dark, unmarked alternatives may exceed the threshold and be incorrectly classified as being marked.
- To address the classification problem, an algorithm uses the lightest mark for an alternative as an anchor against which to judge other alternatives. The current implementation of the algorithm recognizes three categories along the light-dark dimension: Light, regular, and dark. When the classification for the lightest bubble has been established, a multiplier is applied to establish the threshold for determining the percentage of pixels required in the annotated space to classify the bubble as marked. A different multiplier is used for each of the three categories in order to insure adequate classification accuracy. The thresholds for the three categories and the multipliers are determined by empirical tests.
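A minimal sketch of the anchoring algorithm; all numeric constants below are invented for illustration, since the text states that the category thresholds and multipliers are determined empirically:

```python
# Hedged sketch of mark classification: the lightest bubble on the
# sheet anchors a light/regular/dark category, and a per-category
# multiplier sets the pixel-fill threshold above which a bubble counts
# as marked. Constants are illustrative, not the empirical values.

CATEGORY_BOUNDS = {"light": 0.10, "regular": 0.25}  # upper bounds on fill fraction
MULTIPLIERS = {"light": 4.0, "regular": 2.5, "dark": 1.6}

def categorize(lightest_fill: float) -> str:
    if lightest_fill < CATEGORY_BOUNDS["light"]:
        return "light"
    if lightest_fill < CATEGORY_BOUNDS["regular"]:
        return "regular"
    return "dark"

def marked_alternatives(fills):
    """fills: pixel-fill fraction for each alternative of one item."""
    anchor = min(fills)                       # lightest mark as the anchor
    threshold = anchor * MULTIPLIERS[categorize(anchor)]
    return [i for i, f in enumerate(fills) if f >= threshold]

# One item printed on a dark printer: unmarked bubbles near 0.20 fill,
# the student's actual mark at 0.60.
print(marked_alternatives([0.20, 0.22, 0.60, 0.21]))  # [2]
```

Because the threshold scales with the lightest bubble, the same sheet printed lighter or darker still classifies only the genuinely filled bubble as marked, which a fixed absolute threshold cannot guarantee.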
- When a user experiences a problem during plain-paper scanning, it is helpful for the user to be able to easily and clearly communicate the problem to technical support personnel. An innovative design assists users in addressing scanning problems. Scanline stores images of scanned answer sheets on the client machine. When a user encounters a problem, the user opens a scanning history feature. This feature shows each scanned image and the status of the image. For example, if there is a scanning problem, the history indicates that an error has occurred and specifies the nature of the error. The user then has the option of making needed adjustments and rescanning the sheet or submitting the image to a server. For example, if the sheet was initially scanned upside down, the user may choose to rescan it. On the other hand, if the nature of the problem is not readily apparent, the user may send the image to a technical support specialist for further review. The capability to pinpoint and view problem images of previously scanned sheets and to send them over the Internet with one click increases scanning accuracy and greatly simplifies the task of identifying scanning errors.
- Assessment options include online assessment as well as offline assessment. Moreover, both options are available within a single assessment. Online assessment is carried out in a virtual student center. A dual password approach provides two levels of security for online assessment. Each student is assigned a username and password that allows entry into the student center. A second username and password enables the student to enter the testing environment for the particular assessment that the student is scheduled to take. When the student logs into the test, his or her name appears at the top of the test. This helps proctors to insure that the students scheduled to take the test are actually the individuals who do take the test.
- At the beginning of each online assessment, the student is provided with instructions explaining how to navigate through the assessment, how to indicate her/his response, and how to review questions. The online testing feature affords flexible navigation, which enables the student to go to any item at any time. Contextual materials such as narratives or charts appear above the question to which they are attached. If the amount of such material is extensive, it appears in a window that permits the student to use a scroll bar to view all of the contextual content. Students may respond to items in a variety of ways depending on the type of item. For example, to respond to items in a multiple-choice format, the student points and clicks on the “radio button” next to her/his response and then clicks Save My Answer. For these and all other questions, the system will automatically take the student to the next question once the save button is clicked.
- As the student proceeds through the test, the numbers for questions already answered are in gray and those not yet answered are in blue. A test completion status bar indicates the proportion of items that have been completed, and a test summary screen lists the questions that have been answered and those that have not been answered. If the student has inadvertently omitted one or more items, the summary can alert the student to the omissions.
- Online Assessment with Mercury
- Assessment innovations include a feature called Mercury that provides the ability to administer assessments using hand-held response pad systems. These systems allow students to enter responses to assessment questions on hand-held units. The units then wirelessly transmit the student's responses to a receiver, where they are read and recorded by Mercury on the instructor's computer in a classroom or computer lab environment. Mercury includes features that increase the ease and efficiency with which devices of this type may be used for administration of assessments. These innovations also help to ensure that data is accurately collected, particularly in the event of technical problems such as hardware failure.
- Mercury includes an application that is installed on the computer located in the classroom being used to administer the assessments. The application has the ability to communicate directly and automatically with the Galileo database over the Internet via web services located on servers. Because this communication is seamless and automatic, it eliminates several of the steps that would otherwise be required of the user. The advantages of this approach are described in detail in the following discussion of the steps required for this type of assessment administration.
- In order to administer an assessment, the first step that a teacher must take is to start the Mercury application. Once the program is started, it uses web services to communicate automatically with the Galileo database. The data that is passed to Mercury includes information such as the available students and the scheduled tests. The need for the teacher to take any extra steps to manually download information in order to start administration of a test is eliminated. For example, the teacher is not required to log into the Galileo servers and download the available tests before they can be selected for administration. The list of tests is automatically available.
- The next step in administration is for the teacher to distribute the hand-held units to the students. Students then start working on the assessment, using the units to enter their answers to the questions. Their responses are received by the Mercury program and saved to the Galileo database via web services. This process is entirely automatic, without the teacher being required to take any action. There is no need for manual uploading of student responses at the end of test administration. This increases ease of use for the teacher because there are fewer tasks to perform. The accuracy of the resulting data is also increased because there is no need for the teacher to remember and successfully perform the steps required to complete a manual data upload of student responses. Because the responses are recorded continually, there is also greater protection in the event of hardware failure. For example, should the hard drive on the computer running the Mercury program fail in the middle of test administration, the data loss would be quite limited. Any student responses that had been entered prior to the time of the failure will have already been recorded on the Galileo servers.
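The continual-recording behavior described above can be sketched as follows. The class and method names here are assumptions for illustration; the actual Mercury web-service interface is not specified in this document:

```python
# Sketch of continuously recording pad responses as they arrive, so a
# mid-test hardware failure on the classroom machine loses at most the
# responses not yet received. 'upload' stands in for a web-service call.

import queue

class ResponseRecorder:
    def __init__(self, upload):
        self.upload = upload           # hypothetical server-side recording call
        self.pending = queue.Queue()   # responses awaiting upload

    def on_response(self, student_id, question_id, answer):
        """Called by the receiver each time a hand-held unit transmits."""
        record = {"student": student_id, "question": question_id, "answer": answer}
        self.pending.put(record)
        self.flush()                   # record immediately, not at test end

    def flush(self):
        """Upload everything pending; requeue on a transient failure."""
        while not self.pending.empty():
            record = self.pending.get()
            try:
                self.upload(record)
            except OSError:
                self.pending.put(record)  # keep locally and retry later
                break
```

Because each response is uploaded as it is received, no end-of-test manual upload step exists to forget or to lose.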
- With both methods of online test administration, test monitoring is possible through the use of Galileo and Mercury. While a group of students is taking a test, teachers may log into the Galileo test administration screen or use links from within the Mercury application to view a monitoring screen showing all students currently taking the test, which questions have been answered, and whether each question was answered correctly or incorrectly.
- Standards-based assessment initiatives generally include assessment information gathered by multiple agencies. For example, under the No Child Left Behind Act, statewide assessments of standards mastery are required each year at specified grade levels. These tests are often accompanied by local standards-based assessment initiatives such as benchmark assessment programs implemented by local school districts. Although both types of assessment are typically aimed at measuring the mastery of state standards, the data from these assessments are generally not linked in ways that provide flexible data combinations to support accurate mastery classification and provide information that can be used to promote standards mastery. An integrative data analysis system links assessment data from local educational agencies to data from superordinate agencies such as state departments of education in ways that promote accurate mastery classification and the achievement of shared goals such as those reflected in state standards.
- Two kinds of data play a key role in standards-based initiatives: continuous data and categorical data. It is virtually universal practice in standards-based initiatives to provide test scores for an assessment on a continuous distribution. Standard Item Response Theory (IRT) techniques (e.g. Thissen & Wainer, 2001) are used to score tests such as benchmark assessments. In standards-based initiatives, it is also customary to segment the score continuum into categories to determine mastery of standards (e.g. Cizek, 2001). For example, a score continuum might be segmented into categories such as exceeds the standard, meets the standard, approaches the standard, and falls far below the standard. Ability scores such as those yielded by IRT lend themselves to statistical techniques appropriate for use with continuous data. By contrast, mastery classifications call for statistical procedures appropriate for use with categorical data. The integrative data analysis system accommodates scores of both types.
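The segmentation of a continuous score distribution into mastery categories can be sketched as follows. The cut scores below are illustrative placeholders; actual programs set them through standard-setting procedures:

```python
# Sketch: mapping a continuous scale score to a categorical mastery level.

import bisect

# Illustrative cut scores (placeholders) segmenting the score continuum.
CUT_SCORES = [350, 500, 650]
CATEGORIES = [
    "falls far below the standard",
    "approaches the standard",
    "meets the standard",
    "exceeds the standard",
]

def mastery_category(scale_score):
    """Locate the score's segment and return the categorical label."""
    return CATEGORIES[bisect.bisect_right(CUT_SCORES, scale_score)]
```

The continuous score remains available for statistics appropriate to continuous data, while the returned label supports the categorical procedures the text describes.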
- Innovative technology involving a continuous score distribution is related to technology described in U.S. Pat. No. 6,322,366 B1 (Nov. 27, 2001) and U.S. Pat. No. 6,468,085 B1 (Oct. 22, 2002), in which a continuous ability score is used along with item parameters to compute the probability that a student will achieve any of a series of goals in a scale comprised of a set of goals. This technology makes it possible to determine standards mastery from diverse data sources. For example, mastery of one standard in a scale might be determined by grades on a sample of work such as a class assignment. Another might come from online assessment.
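One standard way such a probability is computed from an ability score and item parameters is the two-parameter logistic IRT model, shown here for illustration; the cited patents describe the actual computation used:

```python
# Sketch: 2PL IRT model giving the probability that a student with
# ability theta achieves a goal with discrimination a and difficulty b.
# (A standard IRT form, used here as an illustrative assumption.)

import math

def p_achieve(theta, a, b):
    """P(goal achieved) = 1 / (1 + exp(-a * (theta - b)))."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))
```

A student whose ability equals the goal's difficulty has a 0.5 probability of achieving it, and the probability rises as ability exceeds difficulty.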
- The present application introduces technology innovations based on existing ATI patents. These innovations make it possible to combine multiple tests into a single assessment, to combine parts of tests to make a new test, and to combine information from tests, class assignments, and other data sources into one scale. These combinatorial assessments can be utilized along with other assessment information to guide instruction toward the mastery of standards.
- ATI's combinatorial innovations have a number of practical benefits. Combining data from different sources into a single assessment can be expected to increase the reliability of the assessment because test reliability is a direct function of test length (e.g. Nunnally & Bernstein, 1994). The data sources being combined are those directly linked to instruction. For example, class assignments and class quizzes are routine features of instruction. When data gleaned from these instructional mainstays is combined in ways that yield psychometrically sound assessments, it is possible to assess the relationship between those assessments and other high-stakes assessments such as statewide tests used to determine student mastery of standards. The relationship between combinatorial assessments and high-stakes assessments provides a measure of the extent to which performance measured as part of instruction is assessing the same thing as high-stakes measures of student performance used to evaluate schools and students.
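The claim that reliability grows with test length is conventionally quantified by the Spearman-Brown prophecy formula, sketched here as a standard psychometric result consistent with the citation above (not a formula stated in this document):

```python
# Sketch: Spearman-Brown prophecy formula predicting reliability when a
# test is lengthened (or shortened) by a given factor.

def spearman_brown(reliability, length_factor):
    """Predicted reliability of a test whose length is multiplied by
    length_factor, given the original test's reliability."""
    return (length_factor * reliability) / (1.0 + (length_factor - 1.0) * reliability)
```

Doubling a test with reliability 0.60, for example, is predicted to raise reliability to 0.75, which is why combining data sources into a longer assessment can be expected to increase reliability.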
- In some cases it is useful to combine parts of benchmark tests with short quizzes. For example, a benchmark test may provide information suggesting an intervention targeting certain capabilities. After the intervention, a short formative assessment may be given to determine whether or not the targeted capabilities have been acquired. It may be useful to create and score a new combinatorial test substituting scores on items from the quiz for corresponding scores on items from the benchmark. The new test might then be used to revise estimates of the risk that students may have of not meeting standards as measured on a statewide test.
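The substitution step described above can be sketched as follows, with item responses keyed by item identifier (an assumed representation for illustration):

```python
# Sketch: build a combinatorial test by substituting quiz item scores
# for the corresponding item scores on the benchmark test.

def combinatorial_responses(benchmark, quiz):
    """Return the benchmark response vector with scores on items that
    also appear in the quiz replaced by the quiz scores."""
    combined = dict(benchmark)
    for item_id, score in quiz.items():
        if item_id in combined:        # substitute only corresponding items
            combined[item_id] = score
    return combined
```

The resulting response vector can then be scored like any other test, for example to revise estimates of the risk of not meeting standards on a statewide test.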
- Ability scores computed from combinatorial assessments can play an important role in guiding instruction. The continuous IRT ability score and item parameters are used to estimate the probability that a student will be able to perform tasks reflecting the mastery of particular standards. This information can be used to determine what capabilities to target for instruction.
- Many variations, modifications and changes may be made in the above described example without departing from the scope and spirit of the invention.
Claims (5)
1. A computer integrated assessment system for standards-based assessments wherein the assessments conform to recorded standards, the system comprising:
means for directing the generation of a plan of a series of customized assessments aligned to respective selected standards of the recorded standards wherein each of the customized assessments is set for a different time within a period of time; and
means for enabling a user to generate a test in one of the customized assessments wherein one of the selected standards of the one customized assessment is displayed to the user during generation of questions for the test.
2. A computer integrated assessment system as defined in claim 1 further comprising:
means for preventing repeating a question in subsequent assessments.
3. A computer integrated assessment system as defined in claim 1 further comprising:
means for allowing review of tests for each customized assessment by respective selected participants.
4. A computer integrated assessment system as defined in claim 1 further including the ability to scan test answer sheets wherein the lightest and darkest answer marks are determined and used to determine the marked answer for each question.
5. A computer integrated assessment system as defined in claim 1 further including means to administer tests using student handheld input devices.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/222,385 US20090162827A1 (en) | 2007-08-07 | 2008-08-07 | Integrated assessment system for standards-based assessments |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US96367607P | 2007-08-07 | 2007-08-07 | |
US96367507P | 2007-08-07 | 2007-08-07 | |
US12/222,385 US20090162827A1 (en) | 2007-08-07 | 2008-08-07 | Integrated assessment system for standards-based assessments |
Publications (1)
Publication Number | Publication Date |
---|---|
US20090162827A1 true US20090162827A1 (en) | 2009-06-25 |
Family
ID=40789087
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/222,384 Expired - Fee Related US8630577B2 (en) | 2007-08-07 | 2008-08-07 | Item banking system for standards-based assessment |
US12/222,385 Abandoned US20090162827A1 (en) | 2007-08-07 | 2008-08-07 | Integrated assessment system for standards-based assessments |
US14/091,165 Abandoned US20140188849A1 (en) | 2007-08-07 | 2013-11-26 | Item banking system for standards-based assessment |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/222,384 Expired - Fee Related US8630577B2 (en) | 2007-08-07 | 2008-08-07 | Item banking system for standards-based assessment |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/091,165 Abandoned US20140188849A1 (en) | 2007-08-07 | 2013-11-26 | Item banking system for standards-based assessment |
Country Status (1)
Country | Link |
---|---|
US (3) | US8630577B2 (en) |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090269730A1 (en) * | 2008-04-28 | 2009-10-29 | Nexlearn, Llc | Simulation authoring tool |
US20100062411A1 (en) * | 2008-09-08 | 2010-03-11 | Rashad Jovan Bartholomew | Device system and method to provide feedback for educators |
US20100167257A1 (en) * | 2008-12-01 | 2010-07-01 | Hugh Norwood | Methods and systems for creating educational resources and aligning educational resources with benchmarks |
US20110066476A1 (en) * | 2009-09-15 | 2011-03-17 | Joseph Fernard Lewis | Business management assessment and consulting assistance system and associated method |
US20110200978A1 (en) * | 2010-02-16 | 2011-08-18 | Assessment Technology Incorporated | Online instructional dialog books |
US20110200979A1 (en) * | 2007-09-04 | 2011-08-18 | Brian Benson | Online instructional dialogs |
US20120077173A1 (en) * | 2010-09-24 | 2012-03-29 | Elizabeth Catherine Crawford | System for performing assessment without testing |
WO2012112390A1 (en) * | 2011-02-16 | 2012-08-23 | Knowledge Factor, Inc. | System and method for adaptive knowledge assessment and learning |
WO2013126701A1 (en) * | 2012-02-24 | 2013-08-29 | National Assoc. Of Boards Of Pharmacy | Test pallet assembly and family assignment |
US8784114B2 (en) | 2003-12-12 | 2014-07-22 | Assessment Technology, Inc. | Interactive computer system for instructor-student teaching and assessment of preschool children |
US9357329B2 (en) | 2011-10-25 | 2016-05-31 | Aquimo, Llc | Method to provide dynamic customized sports instruction responsive to motion of a mobile device |
WO2017074169A1 (en) * | 2015-10-28 | 2017-05-04 | PACHECO NAVARRO, Diana | Method implemented by a computing unit for the classification and pairing of mathematics learning packets with classes of students |
CN107240340A (en) * | 2017-07-20 | 2017-10-10 | 国网山东省电力公司潍坊供电公司 | Power network transformer substation system five-prevention operation practises checking device |
US10541884B2 (en) | 2017-10-12 | 2020-01-21 | Pearson Education, Inc. | Simulating a user score from input objectives |
US20200251007A1 (en) * | 2019-02-04 | 2020-08-06 | Pearson Education, Inc. | Systems and methods for item response modelling of digital assessments |
US10866956B2 (en) * | 2017-10-12 | 2020-12-15 | Pearson Education, Inc. | Optimizing user time and resources |
US20220180763A1 (en) * | 2011-06-01 | 2022-06-09 | D2L Corporation | Systems and methods for providing information incorporating reinforcement-based learning and feedback |
US11423035B2 (en) * | 2019-02-04 | 2022-08-23 | Pearson Education, Inc. | Scoring system for digital assessment quality with harmonic averaging |
US11422989B2 (en) | 2019-02-04 | 2022-08-23 | Pearson Education, Inc. | Scoring system for digital assessment quality |
US11443140B2 (en) | 2018-02-20 | 2022-09-13 | Pearson Education, Inc. | Systems and methods for automated machine learning model training for a custom authored prompt |
US11449762B2 (en) | 2018-02-20 | 2022-09-20 | Pearson Education, Inc. | Real time development of auto scoring essay models for custom created prompts |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8630577B2 (en) * | 2007-08-07 | 2014-01-14 | Assessment Technology Incorporated | Item banking system for standards-based assessment |
US20140188746A1 (en) * | 2012-12-31 | 2014-07-03 | Jiaojiao Li | Systems, methods, and devices for determining alignment of education content to one or more education standards |
US20140272892A1 (en) * | 2013-03-15 | 2014-09-18 | Edison Learning Inc. | On-line custom course builder |
JP6209098B2 (en) * | 2014-02-07 | 2017-10-04 | 富士通株式会社 | Data management program, data management method, and data management system |
CN105373564B (en) * | 2014-08-29 | 2018-06-19 | 深圳锐取信息技术股份有限公司 | Station table generating method and device are examined based on Objective Structured Clinical Examination frame |
US20210382865A1 (en) * | 2020-06-09 | 2021-12-09 | Act, Inc. | Secure model item tracking system |
US20220092690A1 (en) * | 2020-09-22 | 2022-03-24 | Briza, Inc. | Evaluation response system and method |
Citations (96)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4967354A (en) * | 1987-06-18 | 1990-10-30 | Tescor, Inc. | Method of preparing customized written examinations |
US5059127A (en) * | 1989-10-26 | 1991-10-22 | Educational Testing Service | Computerized mastery testing system, a computer administered variable length sequential testing system for making pass/fail decisions |
US5122952A (en) * | 1990-10-22 | 1992-06-16 | Minkus Leslie S | Method and apparatus for automated learning tool selection for child development |
US5173051A (en) * | 1991-10-15 | 1992-12-22 | Optical Data Corporation | Curriculum planning and publishing method |
US5174759A (en) * | 1988-08-04 | 1992-12-29 | Preston Frank S | TV animation interactively controlled by the viewer through input above a book page |
US5261823A (en) * | 1991-01-16 | 1993-11-16 | Brother Kogyo Kabushiki Kaisha | Electronic learning machine which is capable of giving learning problems matching the student's scholastic ability |
US5267865A (en) * | 1992-02-11 | 1993-12-07 | John R. Lee | Interactive computer aided natural learning method and apparatus |
US5295836A (en) * | 1990-09-03 | 1994-03-22 | Fujitsu Limited | Remote lecturing system |
US5310349A (en) * | 1992-04-30 | 1994-05-10 | Jostens Learning Corporation | Instructional management system |
US5326270A (en) * | 1991-08-29 | 1994-07-05 | Introspect Technologies, Inc. | System and method for assessing an individual's task-processing style |
US5411271A (en) * | 1994-01-03 | 1995-05-02 | Coastal Amusement Distributors, Inc. | Electronic video match game |
US5537587A (en) * | 1994-05-11 | 1996-07-16 | International Business Machines Corporation | File record maintenance in a data processing system by synchronization of menus with corresponding data files |
US5558520A (en) * | 1994-08-23 | 1996-09-24 | Werzberger; Bernice F. | Interactive book assembly |
US5727950A (en) * | 1996-05-22 | 1998-03-17 | Netsage Corporation | Agent based instruction system and method |
US5730604A (en) * | 1994-06-13 | 1998-03-24 | Mediaseek Technologies, Inc. | Method and apparatus for correlating educational requirements |
US5743746A (en) * | 1996-04-17 | 1998-04-28 | Ho; Chi Fai | Reward enriched learning system and method |
US5779486A (en) * | 1996-03-19 | 1998-07-14 | Ho; Chi Fai | Methods and apparatus to assess and enhance a student's understanding in a subject |
US5823788A (en) * | 1995-11-13 | 1998-10-20 | Lemelson; Jerome H. | Interactive educational system and method |
US5829983A (en) * | 1994-09-02 | 1998-11-03 | Fujitsu Limited | System for carrying out educational management |
US5835758A (en) * | 1995-02-28 | 1998-11-10 | Vidya Technologies, Inc. | Method and system for respresenting and processing physical and conceptual entities |
US5864869A (en) * | 1996-07-18 | 1999-01-26 | Doak; Ron K. | Method and manufacture of lesson plans and classroom organizers utilizing computers and software |
US5893717A (en) * | 1994-02-01 | 1999-04-13 | Educational Testing Service | Computerized method and system for teaching prose, document and quantitative literacy |
US5904485A (en) * | 1994-03-24 | 1999-05-18 | Ncr Corporation | Automated lesson selection and examination in computer-assisted education |
US5967793A (en) * | 1996-05-28 | 1999-10-19 | Ho; Chi Fai | Relationship-based computer-aided-educational system |
US6024577A (en) * | 1997-05-29 | 2000-02-15 | Fujitsu Limited | Network-based education system with capability to provide review material according to individual students' understanding levels |
US6029043A (en) * | 1998-01-29 | 2000-02-22 | Ho; Chi Fai | Computer-aided group-learning methods and systems |
US6030226A (en) * | 1996-03-27 | 2000-02-29 | Hersh; Michael | Application of multi-media technology to psychological and educational assessment tools |
US6044387A (en) * | 1997-09-10 | 2000-03-28 | Microsoft Corporation | Single command editing of multiple files |
US6042384A (en) * | 1998-06-30 | 2000-03-28 | Bookette Software Company | Computerized systems for optically scanning and electronically scoring and reporting test results |
US6077085A (en) * | 1998-05-19 | 2000-06-20 | Intellectual Reserve, Inc. | Technology assisted learning |
US6091930A (en) * | 1997-03-04 | 2000-07-18 | Case Western Reserve University | Customizable interactive textbook |
US6112049A (en) * | 1997-10-21 | 2000-08-29 | The Riverside Publishing Company | Computer network based testing system |
US6120300A (en) * | 1996-04-17 | 2000-09-19 | Ho; Chi Fai | Reward enriched learning system and method II |
US6139330A (en) * | 1996-03-19 | 2000-10-31 | Ho; Chi Fai | Computer-aided learning system and method |
US6149441A (en) * | 1998-11-06 | 2000-11-21 | Technology For Connecticut, Inc. | Computer-based educational system |
US6202060B1 (en) * | 1996-10-29 | 2001-03-13 | Bao Q. Tran | Data management system |
US6263434B1 (en) * | 1999-09-21 | 2001-07-17 | Sun Microsystems, Inc. | Signed group criteria |
US6261103B1 (en) * | 1999-04-15 | 2001-07-17 | Cb Sciences, Inc. | System for analyzing and/or effecting experimental data from a remote location |
US6269355B1 (en) * | 1999-04-15 | 2001-07-31 | Kadiri, Inc. | Automated process guidance system and method using knowledge management system |
US6305942B1 (en) * | 1998-11-12 | 2001-10-23 | Metalearning Systems, Inc. | Method and apparatus for increased language fluency through interactive comprehension, recognition and generation of sounds, words and sentences |
US20010034016A1 (en) * | 2000-02-10 | 2001-10-25 | Ziv-El Shimon G. | Method and system for online teaching using web pages |
US20010039000A1 (en) * | 2000-02-29 | 2001-11-08 | Parsons Thomas Gregory | Reading tutor for infants |
US6315572B1 (en) * | 1995-03-22 | 2001-11-13 | William M. Bancroft | Method and system for computerized authoring, learning, and evaluation |
US6322366B1 (en) * | 1998-06-30 | 2001-11-27 | Assessment Technology Inc. | Instructional management system |
US20010049085A1 (en) * | 1998-10-07 | 2001-12-06 | Cognitive Concepts, Inc. | Phonological awareness, phonological processing, and reading skill training system and method |
US6341212B1 (en) * | 1999-12-17 | 2002-01-22 | Virginia Foundation For Independent Colleges | System and method for certifying information technology skill through internet distribution examination |
US6353447B1 (en) * | 1999-01-26 | 2002-03-05 | Microsoft Corporation | Study planner system and method |
US6364666B1 (en) * | 1997-12-17 | 2002-04-02 | Scientific Learning Corp. | Method for adaptive training of listening and language comprehension using processed speech within an animated story |
US20020076675A1 (en) * | 2000-09-28 | 2002-06-20 | Scientific Learning Corporation | Method and apparatus for automated training of language learning skills |
US6468085B1 (en) * | 2000-07-28 | 2002-10-22 | Assessment Technology Inc. | Scale builder and method |
US20020177109A1 (en) * | 2001-02-24 | 2002-11-28 | Torrance Robinson | System and method for creating, processing and managing educational content within and between schools |
US6496681B1 (en) * | 1999-09-22 | 2002-12-17 | Chet D. Linton | Method and system for accessing and interchanging multimedia data in an interactive format professional development platform |
US20020192631A1 (en) * | 2001-05-23 | 2002-12-19 | Chase Weir | Method and system for interactive teaching |
US6498920B1 (en) * | 2000-04-18 | 2002-12-24 | We-Comply, Inc. | Customizable web-based training system |
US20020199118A1 (en) * | 2001-02-02 | 2002-12-26 | Medinservice.Com, Inc. | Internet training course system and methods |
US6505031B1 (en) * | 2000-02-25 | 2003-01-07 | Robert Slider | System and method for providing a virtual school environment |
US20030017442A1 (en) * | 2001-06-15 | 2003-01-23 | Tudor William P. | Standards-based adaptive educational measurement and assessment system and method |
US20030027121A1 (en) * | 2001-08-01 | 2003-02-06 | Paul Grudnitski | Method and system for interactive case and video-based teacher training |
US20030039949A1 (en) * | 2001-04-23 | 2003-02-27 | David Cappellucci | Method and system for correlating a plurality of information resources |
US20030044762A1 (en) * | 2001-08-29 | 2003-03-06 | Assessment Technology Inc. | Educational management system |
US6535713B1 (en) * | 1996-05-09 | 2003-03-18 | Verizon Services Corp. | Interactive training application |
US6547568B1 (en) * | 2000-10-12 | 2003-04-15 | Kiyokazu Yamano | Education intermediary system and method |
US6554618B1 (en) * | 2001-04-20 | 2003-04-29 | Cheryl B. Lockwood | Managed integrated teaching providing individualized instruction |
US6561821B1 (en) * | 2001-12-28 | 2003-05-13 | Hon Hai Precision Ind. Co., Ltd. | High profile board-to-board electrical connector assembly |
US6561812B1 (en) * | 2000-10-30 | 2003-05-13 | Learncity, Inc. | System and method of correlating learning materials with educational objectives |
US6592379B1 (en) * | 1996-09-25 | 2003-07-15 | Sylvan Learning Systems, Inc. | Method for displaying instructional material during a learning session |
US20030180703A1 (en) * | 2002-01-28 | 2003-09-25 | Edusoft | Student assessment system |
US20040002039A1 (en) * | 2002-06-28 | 2004-01-01 | Accenture Global Services Gmbh, Of Switzerland | Course content development for business driven learning solutions |
US20040002049A1 (en) * | 2002-07-01 | 2004-01-01 | Jay Beavers | Computer network-based, interactive, multimedia learning system and process |
US6676413B1 (en) * | 2002-04-17 | 2004-01-13 | Voyager Expanded Learning, Inc. | Method and system for preventing illiteracy in substantially all members of a predetermined set |
US20040063085A1 (en) * | 2001-01-09 | 2004-04-01 | Dror Ivanir | Training system and method for improving user knowledge and skills |
US20040115608A1 (en) * | 2002-08-29 | 2004-06-17 | Paul Meyer | System and method for delivering, receiving and managing continuing educational and training services |
US6773266B1 (en) * | 1998-07-31 | 2004-08-10 | Athenium, L.L.C. | Method for implementing collaborative training and online learning over a computer network and related techniques |
US6789047B1 (en) * | 2001-04-17 | 2004-09-07 | Unext.Com Llc | Method and system for evaluating the performance of an instructor of an electronic course |
US6793129B2 (en) * | 2001-08-17 | 2004-09-21 | Leapfrog Enterprises, Inc. | Study aid apparatus and method of using study aid apparatus |
US20050110461A1 (en) * | 2000-08-01 | 2005-05-26 | Earthlink Communications | Mobile teaching system |
US20050221266A1 (en) * | 2004-04-02 | 2005-10-06 | Mislevy Robert J | System and method for assessment design |
US20050250087A1 (en) * | 1997-03-27 | 2005-11-10 | Driscoll Gary F | System and method for computer based creation of tests formatted to facilitate computer based testing |
US20060003306A1 (en) * | 2004-07-02 | 2006-01-05 | Mcginley Michael P | Unified web-based system for the delivery, scoring, and reporting of on-line and paper-based assessments |
US20060046237A1 (en) * | 2004-09-02 | 2006-03-02 | Griffin Charles W | Methods, systems and computer program products for creating and delivering prescriptive learning |
US20060078863A1 (en) * | 2001-02-09 | 2006-04-13 | Grow.Net, Inc. | System and method for processing test reports |
US20060172274A1 (en) * | 2004-12-30 | 2006-08-03 | Nolasco Norman J | System and method for real time tracking of student performance based on state educational standards |
US20060199163A1 (en) * | 2005-03-04 | 2006-09-07 | Johnson Andrea L | Dynamic teaching method |
US20060216683A1 (en) * | 2003-05-14 | 2006-09-28 | Goradia Gautam D | Interactive system for building, organising, and sharing one's own databank of questions and answers in a variety of questioning formats, on any subject in one or more languages |
US20060294552A1 (en) * | 2005-06-27 | 2006-12-28 | Renaissance Learning, Inc. | Audience response system and method |
US20070099169A1 (en) * | 2005-10-27 | 2007-05-03 | Darin Beamish | Software product and methods for recording and improving student performance |
US20070111180A1 (en) * | 2005-10-24 | 2007-05-17 | Sperle Robin U | Delivery methods for remote learning system courses |
US20070122788A1 (en) * | 2005-11-28 | 2007-05-31 | Microsoft Corporation | Virtual teaching assistant |
US20070160969A1 (en) * | 2006-01-11 | 2007-07-12 | Barton Benny M | Method and Apparatus for Associating User Evaluations with Independent Content Sources |
US20070178432A1 (en) * | 2006-02-02 | 2007-08-02 | Les Davis | Test management and assessment system and method |
US7311524B2 (en) * | 2002-01-17 | 2007-12-25 | Harcourt Assessment, Inc. | System and method assessing student achievement |
US20080040502A1 (en) * | 2006-07-11 | 2008-02-14 | Holsberry Richard T | Automated tracking of class attendance |
US7362997B2 (en) * | 2004-04-22 | 2008-04-22 | Aurelia Hartenberger | Methods and apparatus for curriculum planning |
US20080124696A1 (en) * | 2006-10-26 | 2008-05-29 | Houser Ronald L | Empirical development of learning content using educational measurement scales |
US20080187893A1 (en) * | 2007-02-02 | 2008-08-07 | Network For Instructional Tv, Inc. | Determining developmental progress for preschool children |
US20140188849A1 (en) * | 2007-08-07 | 2014-07-03 | Assessment Technology Incorporated | Item banking system for standards-based assessment |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6398556B1 (en) * | 1998-07-06 | 2002-06-04 | Chi Fai Ho | Inexpensive computer-aided learning methods and apparatus for learners |
US7031651B2 (en) | 2000-07-21 | 2006-04-18 | Englishtown, Inc. | System and method of matching teachers with students to facilitate conducting online private instruction over a global network |
US20020086271A1 (en) * | 2000-12-28 | 2002-07-04 | Murgia Paula J. | Interactive system for personal life patterns |
US20040076930A1 (en) * | 2002-02-22 | 2004-04-22 | Steinberg Linda S. | Partal assessment design system for educational testing |
US6808267B2 (en) * | 2002-10-18 | 2004-10-26 | Childsplay Vision Systems | Method for automated mass screening for visual dysfunction in children |
US20040229199A1 (en) * | 2003-04-16 | 2004-11-18 | Measured Progress, Inc. | Computer-based standardized test administration, scoring and analysis system |
CA2427786A1 (en) * | 2003-05-02 | 2004-11-02 | Auckland Uniservices Limited | System, method and computer program for student assessment |
US20050114160A1 (en) * | 2003-11-26 | 2005-05-26 | International Business Machines Corporation | Method, apparatus and computer program code for automation of assessment using rubrics |
US20050125196A1 (en) * | 2003-12-09 | 2005-06-09 | Len Swanson | Method and system for computer-assisted test construction performing specification matching during test item selection |
US20080038705A1 (en) * | 2006-07-14 | 2008-02-14 | Kerns Daniel R | System and method for assessing student progress and delivering appropriate content |
2008
- 2008-08-07 US US12/222,384 patent/US8630577B2/en not_active Expired - Fee Related
- 2008-08-07 US US12/222,385 patent/US20090162827A1/en not_active Abandoned

2013
- 2013-11-26 US US14/091,165 patent/US20140188849A1/en not_active Abandoned
Patent Citations (101)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4967354A (en) * | 1987-06-18 | 1990-10-30 | Tescor, Inc. | Method of preparing customized written examinations |
US5174759A (en) * | 1988-08-04 | 1992-12-29 | Preston Frank S | TV animation interactively controlled by the viewer through input above a book page |
US5059127A (en) * | 1989-10-26 | 1991-10-22 | Educational Testing Service | Computerized mastery testing system, a computer administered variable length sequential testing system for making pass/fail decisions |
US5295836A (en) * | 1990-09-03 | 1994-03-22 | Fujitsu Limited | Remote lecturing system |
US5122952A (en) * | 1990-10-22 | 1992-06-16 | Minkus Leslie S | Method and apparatus for automated learning tool selection for child development |
US5261823A (en) * | 1991-01-16 | 1993-11-16 | Brother Kogyo Kabushiki Kaisha | Electronic learning machine which is capable of giving learning problems matching the student's scholastic ability |
US5326270A (en) * | 1991-08-29 | 1994-07-05 | Introspect Technologies, Inc. | System and method for assessing an individual's task-processing style |
US5173051A (en) * | 1991-10-15 | 1992-12-22 | Optical Data Corporation | Curriculum planning and publishing method |
US5173051B1 (en) * | 1991-10-15 | 1997-06-10 | Optical Data Corp | Curriculum planning and publishing method |
US5267865A (en) * | 1992-02-11 | 1993-12-07 | John R. Lee | Interactive computer aided natural learning method and apparatus |
US5310349A (en) * | 1992-04-30 | 1994-05-10 | Jostens Learning Corporation | Instructional management system |
US5411271A (en) * | 1994-01-03 | 1995-05-02 | Coastal Amusement Distributors, Inc. | Electronic video match game |
US5893717A (en) * | 1994-02-01 | 1999-04-13 | Educational Testing Service | Computerized method and system for teaching prose, document and quantitative literacy |
US5904485A (en) * | 1994-03-24 | 1999-05-18 | Ncr Corporation | Automated lesson selection and examination in computer-assisted education |
US5537587A (en) * | 1994-05-11 | 1996-07-16 | International Business Machines Corporation | File record maintenance in a data processing system by synchronization of menus with corresponding data files |
US5823789A (en) * | 1994-06-13 | 1998-10-20 | Mediaseek Technologies, Inc. | Method and apparatus for correlating educational requirements |
US5730604A (en) * | 1994-06-13 | 1998-03-24 | Mediaseek Technologies, Inc. | Method and apparatus for correlating educational requirements |
US5558520A (en) * | 1994-08-23 | 1996-09-24 | Werzberger; Bernice F. | Interactive book assembly |
US5829983A (en) * | 1994-09-02 | 1998-11-03 | Fujitsu Limited | System for carrying out educational management |
US5835758A (en) * | 1995-02-28 | 1998-11-10 | Vidya Technologies, Inc. | Method and system for respresenting and processing physical and conceptual entities |
US6315572B1 (en) * | 1995-03-22 | 2001-11-13 | William M. Bancroft | Method and system for computerized authoring, learning, and evaluation |
US5823788A (en) * | 1995-11-13 | 1998-10-20 | Lemelson; Jerome H. | Interactive educational system and method |
US6118973A (en) * | 1996-03-19 | 2000-09-12 | Ho; Chi Fai | Methods and apparatus to assess and enhance a student's understanding in a subject |
US5779486A (en) * | 1996-03-19 | 1998-07-14 | Ho; Chi Fai | Methods and apparatus to assess and enhance a student's understanding in a subject |
US6139330A (en) * | 1996-03-19 | 2000-10-31 | Ho; Chi Fai | Computer-aided learning system and method |
US6030226A (en) * | 1996-03-27 | 2000-02-29 | Hersh; Michael | Application of multi-media technology to psychological and educational assessment tools |
US5743746A (en) * | 1996-04-17 | 1998-04-28 | Ho; Chi Fai | Reward enriched learning system and method |
US6120300A (en) * | 1996-04-17 | 2000-09-19 | Ho; Chi Fai | Reward enriched learning system and method II |
US6535713B1 (en) * | 1996-05-09 | 2003-03-18 | Verizon Services Corp. | Interactive training application |
US5727950A (en) * | 1996-05-22 | 1998-03-17 | Netsage Corporation | Agent based instruction system and method |
US5967793A (en) * | 1996-05-28 | 1999-10-19 | Ho; Chi Fai | Relationship-based computer-aided-educational system |
US5864869A (en) * | 1996-07-18 | 1999-01-26 | Doak; Ron K. | Method and manufacture of lesson plans and classroom organizers utilizing computers and software |
US6592379B1 (en) * | 1996-09-25 | 2003-07-15 | Sylvan Learning Systems, Inc. | Method for displaying instructional material during a learning session |
US6202060B1 (en) * | 1996-10-29 | 2001-03-13 | Bao Q. Tran | Data management system |
US6091930A (en) * | 1997-03-04 | 2000-07-18 | Case Western Reserve University | Customizable interactive textbook |
US20050250087A1 (en) * | 1997-03-27 | 2005-11-10 | Driscoll Gary F | System and method for computer based creation of tests formatted to facilitate computer based testing |
US6024577A (en) * | 1997-05-29 | 2000-02-15 | Fujitsu Limited | Network-based education system with capability to provide review material according to individual students' understanding levels |
US6044387A (en) * | 1997-09-10 | 2000-03-28 | Microsoft Corporation | Single command editing of multiple files |
US6418298B1 (en) * | 1997-10-21 | 2002-07-09 | The Riverside Publishing Co. | Computer network based testing system |
US6112049A (en) * | 1997-10-21 | 2000-08-29 | The Riverside Publishing Company | Computer network based testing system |
US6364666B1 (en) * | 1997-12-17 | 2002-04-02 | Scientific Learning Corp. | Method for adaptive training of listening and language comprehension using processed speech within an animated story |
US6029043A (en) * | 1998-01-29 | 2000-02-22 | Ho; Chi Fai | Computer-aided group-learning methods and systems |
US6077085A (en) * | 1998-05-19 | 2000-06-20 | Intellectual Reserve, Inc. | Technology assisted learning |
US6042384A (en) * | 1998-06-30 | 2000-03-28 | Bookette Software Company | Computerized systems for optically scanning and electronically scoring and reporting test results |
US6322366B1 (en) * | 1998-06-30 | 2001-11-27 | Assessment Technology Inc. | Instructional management system |
US6773266B1 (en) * | 1998-07-31 | 2004-08-10 | Athenium, L.L.C. | Method for implementing collaborative training and online learning over a computer network and related techniques |
US20010049085A1 (en) * | 1998-10-07 | 2001-12-06 | Cognitive Concepts, Inc. | Phonological awareness, phonological processing, and reading skill training system and method |
US6149441A (en) * | 1998-11-06 | 2000-11-21 | Technology For Connecticut, Inc. | Computer-based educational system |
US6305942B1 (en) * | 1998-11-12 | 2001-10-23 | Metalearning Systems, Inc. | Method and apparatus for increased language fluency through interactive comprehension, recognition and generation of sounds, words and sentences |
US6353447B1 (en) * | 1999-01-26 | 2002-03-05 | Microsoft Corporation | Study planner system and method |
US6261103B1 (en) * | 1999-04-15 | 2001-07-17 | Cb Sciences, Inc. | System for analyzing and/or effecting experimental data from a remote location |
US6269355B1 (en) * | 1999-04-15 | 2001-07-31 | Kadiri, Inc. | Automated process guidance system and method using knowledge management system |
US6263434B1 (en) * | 1999-09-21 | 2001-07-17 | Sun Microsystems, Inc. | Signed group criteria |
US6496681B1 (en) * | 1999-09-22 | 2002-12-17 | Chet D. Linton | Method and system for accessing and interchanging multimedia data in an interactive format professional development platform |
US6341212B1 (en) * | 1999-12-17 | 2002-01-22 | Virginia Foundation For Independent Colleges | System and method for certifying information technology skill through internet distribution examination |
US20010034016A1 (en) * | 2000-02-10 | 2001-10-25 | Ziv-El Shimon G. | Method and system for online teaching using web pages |
US6505031B1 (en) * | 2000-02-25 | 2003-01-07 | Robert Slider | System and method for providing a virtual school environment |
US20010039000A1 (en) * | 2000-02-29 | 2001-11-08 | Parsons Thomas Gregory | Reading tutor for infants |
US6498920B1 (en) * | 2000-04-18 | 2002-12-24 | We-Comply, Inc. | Customizable web-based training system |
US6468085B1 (en) * | 2000-07-28 | 2002-10-22 | Assessment Technology Inc. | Scale builder and method |
US20050110461A1 (en) * | 2000-08-01 | 2005-05-26 | Earthlink Communications | Mobile teaching system |
US7160113B2 (en) * | 2000-08-01 | 2007-01-09 | Earthwalk Communication, Inc. | Mobile teaching system |
US20020076675A1 (en) * | 2000-09-28 | 2002-06-20 | Scientific Learning Corporation | Method and apparatus for automated training of language learning skills |
US6547568B1 (en) * | 2000-10-12 | 2003-04-15 | Kiyokazu Yamano | Education intermediary system and method |
US6561812B1 (en) * | 2000-10-30 | 2003-05-13 | Learncity, Inc. | System and method of correlating learning materials with educational objectives |
US20040063085A1 (en) * | 2001-01-09 | 2004-04-01 | Dror Ivanir | Training system and method for improving user knowledge and skills |
US20020199118A1 (en) * | 2001-02-02 | 2002-12-26 | Medinservice.Com, Inc. | Internet training course system and methods |
US20060078863A1 (en) * | 2001-02-09 | 2006-04-13 | Grow.Net, Inc. | System and method for processing test reports |
US20020177109A1 (en) * | 2001-02-24 | 2002-11-28 | Torrance Robinson | System and method for creating, processing and managing educational content within and between schools |
US6789047B1 (en) * | 2001-04-17 | 2004-09-07 | Unext.Com Llc | Method and system for evaluating the performance of an instructor of an electronic course |
US6554618B1 (en) * | 2001-04-20 | 2003-04-29 | Cheryl B. Lockwood | Managed integrated teaching providing individualized instruction |
US20030039949A1 (en) * | 2001-04-23 | 2003-02-27 | David Cappellucci | Method and system for correlating a plurality of information resources |
US20020192631A1 (en) * | 2001-05-23 | 2002-12-19 | Chase Weir | Method and system for interactive teaching |
US20030017442A1 (en) * | 2001-06-15 | 2003-01-23 | Tudor William P. | Standards-based adaptive educational measurement and assessment system and method |
US20030027121A1 (en) * | 2001-08-01 | 2003-02-06 | Paul Grudnitski | Method and system for interactive case and video-based teacher training |
US6793129B2 (en) * | 2001-08-17 | 2004-09-21 | Leapfrog Enterprises, Inc. | Study aid apparatus and method of using study aid apparatus |
US20030044762A1 (en) * | 2001-08-29 | 2003-03-06 | Assessment Technology Inc. | Educational management system |
US6561821B1 (en) * | 2001-12-28 | 2003-05-13 | Hon Hai Precision Ind. Co., Ltd. | High profile board-to-board electrical connector assembly |
US7311524B2 (en) * | 2002-01-17 | 2007-12-25 | Harcourt Assessment, Inc. | System and method assessing student achievement |
US20030180703A1 (en) * | 2002-01-28 | 2003-09-25 | Edusoft | Student assessment system |
US6676413B1 (en) * | 2002-04-17 | 2004-01-13 | Voyager Expanded Learning, Inc. | Method and system for preventing illiteracy in substantially all members of a predetermined set |
US20040002039A1 (en) * | 2002-06-28 | 2004-01-01 | Accenture Global Services Gmbh, Of Switzerland | Course content development for business driven learning solutions |
US20040002049A1 (en) * | 2002-07-01 | 2004-01-01 | Jay Beavers | Computer network-based, interactive, multimedia learning system and process |
US20040115608A1 (en) * | 2002-08-29 | 2004-06-17 | Paul Meyer | System and method for delivering, receiving and managing continuing educational and training services |
US20060216683A1 (en) * | 2003-05-14 | 2006-09-28 | Goradia Gautam D | Interactive system for building, organising, and sharing one's own databank of questions and answers in a variety of questioning formats, on any subject in one or more languages |
US20050221266A1 (en) * | 2004-04-02 | 2005-10-06 | Mislevy Robert J | System and method for assessment design |
US7362997B2 (en) * | 2004-04-22 | 2008-04-22 | Aurelia Hartenberger | Methods and apparatus for curriculum planning |
US20060003306A1 (en) * | 2004-07-02 | 2006-01-05 | Mcginley Michael P | Unified web-based system for the delivery, scoring, and reporting of on-line and paper-based assessments |
US20060046237A1 (en) * | 2004-09-02 | 2006-03-02 | Griffin Charles W | Methods, systems and computer program products for creating and delivering prescriptive learning |
US20060172274A1 (en) * | 2004-12-30 | 2006-08-03 | Nolasco Norman J | System and method for real time tracking of student performance based on state educational standards |
US20060199163A1 (en) * | 2005-03-04 | 2006-09-07 | Johnson Andrea L | Dynamic teaching method |
US20060294552A1 (en) * | 2005-06-27 | 2006-12-28 | Renaissance Learning, Inc. | Audience response system and method |
US20070111180A1 (en) * | 2005-10-24 | 2007-05-17 | Sperle Robin U | Delivery methods for remote learning system courses |
US20070099169A1 (en) * | 2005-10-27 | 2007-05-03 | Darin Beamish | Software product and methods for recording and improving student performance |
US20070122788A1 (en) * | 2005-11-28 | 2007-05-31 | Microsoft Corporation | Virtual teaching assistant |
US20070160969A1 (en) * | 2006-01-11 | 2007-07-12 | Barton Benny M | Method and Apparatus for Associating User Evaluations with Independent Content Sources |
US20070178432A1 (en) * | 2006-02-02 | 2007-08-02 | Les Davis | Test management and assessment system and method |
US20080040502A1 (en) * | 2006-07-11 | 2008-02-14 | Holsberry Richard T | Automated tracking of class attendance |
US20080124696A1 (en) * | 2006-10-26 | 2008-05-29 | Houser Ronald L | Empirical development of learning content using educational measurement scales |
US20080187893A1 (en) * | 2007-02-02 | 2008-08-07 | Network For Instructional Tv, Inc. | Determining developmental progress for preschool children |
US20140188849A1 (en) * | 2007-08-07 | 2014-07-03 | Assessment Technology Incorporated | Item banking system for standards-based assessment |
Cited By (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8784114B2 (en) | 2003-12-12 | 2014-07-22 | Assessment Technology, Inc. | Interactive computer system for instructor-student teaching and assessment of preschool children |
US20110200979A1 (en) * | 2007-09-04 | 2011-08-18 | Brian Benson | Online instructional dialogs |
US20090269730A1 (en) * | 2008-04-28 | 2009-10-29 | Nexlearn, Llc | Simulation authoring tool |
US8798522B2 (en) * | 2008-04-28 | 2014-08-05 | Nexlearn, Llc | Simulation authoring tool |
US20100062411A1 (en) * | 2008-09-08 | 2010-03-11 | Rashad Jovan Bartholomew | Device system and method to provide feedback for educators |
US20100167257A1 (en) * | 2008-12-01 | 2010-07-01 | Hugh Norwood | Methods and systems for creating educational resources and aligning educational resources with benchmarks |
US20110066476A1 (en) * | 2009-09-15 | 2011-03-17 | Joseph Fernard Lewis | Business management assessment and consulting assistance system and associated method |
US20110200978A1 (en) * | 2010-02-16 | 2011-08-18 | Assessment Technology Incorporated | Online instructional dialog books |
US20120077173A1 (en) * | 2010-09-24 | 2012-03-29 | Elizabeth Catherine Crawford | System for performing assessment without testing |
US9824603B2 (en) | 2010-09-24 | 2017-11-21 | Lexia Learning Systems Llc | System for performing assessment without testing |
US9299266B2 (en) * | 2010-09-24 | 2016-03-29 | Lexia Learning Systems Llc | System for performing assessment without testing |
US10586467B2 (en) | 2010-09-24 | 2020-03-10 | Lexia Learning Systems, Inc. | System for utilizing assessment without testing |
WO2012112390A1 (en) * | 2011-02-16 | 2012-08-23 | Knowledge Factor, Inc. | System and method for adaptive knowledge assessment and learning |
TWI474297B (en) * | 2011-02-16 | 2015-02-21 | Knowledge Factor Inc | System and method for adaptive knowledge assessment and learning |
TWI579813B (en) * | 2011-02-16 | 2017-04-21 | 知識要素公司 | System and method for adaptive knowledge assessment and learning |
US20220180763A1 (en) * | 2011-06-01 | 2022-06-09 | D2L Corporation | Systems and methods for providing information incorporating reinforcement-based learning and feedback |
US9357329B2 (en) | 2011-10-25 | 2016-05-31 | Aquimo, Llc | Method to provide dynamic customized sports instruction responsive to motion of a mobile device |
WO2013126701A1 (en) * | 2012-02-24 | 2013-08-29 | National Assoc. Of Boards Of Pharmacy | Test pallet assembly and family assignment |
US10522050B2 (en) | 2012-02-24 | 2019-12-31 | National Assoc. Of Boards Of Pharmacy | Test pallet assembly |
US9767707B2 (en) | 2012-02-24 | 2017-09-19 | National Assoc. Of Boards Of Pharmacy | Test pallet assembly and family assignment |
WO2017074169A1 (en) * | 2015-10-28 | 2017-05-04 | PACHECO NAVARRO, Diana | Method implemented by a computing unit for the classification and pairing of mathematics learning packets with classes of students |
CN107240340A (en) * | 2017-07-20 | 2017-10-10 | 国网山东省电力公司潍坊供电公司 | Five-prevention operation training and examination device for power grid substation systems |
US10541884B2 (en) | 2017-10-12 | 2020-01-21 | Pearson Education, Inc. | Simulating a user score from input objectives |
US10866956B2 (en) * | 2017-10-12 | 2020-12-15 | Pearson Education, Inc. | Optimizing user time and resources |
US11741849B2 (en) * | 2018-02-20 | 2023-08-29 | Pearson Education, Inc. | Systems and methods for interface-based machine learning model output customization |
US11875706B2 (en) | 2018-02-20 | 2024-01-16 | Pearson Education, Inc. | Systems and methods for automated machine learning model training quality control |
US11817014B2 (en) | 2018-02-20 | 2023-11-14 | Pearson Education, Inc. | Systems and methods for interface-based automated custom authored prompt evaluation |
US11443140B2 (en) | 2018-02-20 | 2022-09-13 | Pearson Education, Inc. | Systems and methods for automated machine learning model training for a custom authored prompt |
US11449762B2 (en) | 2018-02-20 | 2022-09-20 | Pearson Education, Inc. | Real time development of auto scoring essay models for custom created prompts |
US11475245B2 (en) | 2018-02-20 | 2022-10-18 | Pearson Education, Inc. | Systems and methods for automated evaluation model customization |
US20200251007A1 (en) * | 2019-02-04 | 2020-08-06 | Pearson Education, Inc. | Systems and methods for item response modelling of digital assessments |
US11422989B2 (en) | 2019-02-04 | 2022-08-23 | Pearson Education, Inc. | Scoring system for digital assessment quality |
US11854433B2 (en) * | 2019-02-04 | 2023-12-26 | Pearson Education, Inc. | Systems and methods for item response modelling of digital assessments |
US11423035B2 (en) * | 2019-02-04 | 2022-08-23 | Pearson Education, Inc. | Scoring system for digital assessment quality with harmonic averaging |
US11960493B2 (en) | 2019-02-04 | 2024-04-16 | Pearson Education, Inc. | Scoring system for digital assessment quality with harmonic averaging |
Also Published As
Publication number | Publication date |
---|---|
US8630577B2 (en) | 2014-01-14 |
US20090164406A1 (en) | 2009-06-25 |
US20140188849A1 (en) | 2014-07-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20090162827A1 (en) | Integrated assessment system for standards-based assessments | |
Belur et al. | A systematic review of police recruit training programmes | |
Cramp et al. | Lessons learned from implementing remotely invigilated online exams | |
Llamas-Nistal et al. | Blended e-assessment: Migrating classical exams to the digital world | |
US8187004B1 (en) | System and method of education administration | |
Long et al. | Performance assessments for beginning teachers: Options and lessons | |
Barri | What makes web-enhanced learning successful: four key elements | |
Smithson | Describing the Enacted Curriculum: Development and Dissemination of Opportunity To Learn Indicators in Science Education. | |
US20080046232A1 (en) | Method and System for E-tol English language test online | |
Stolte et al. | The reliability of non-cognitive admissions measures in predicting non-traditional doctor of pharmacy student performance outcomes | |
Wise et al. | Independent evaluation of the California high school exit examination (CAHSEE): Analysis of the 2001 administration | |
Tirrell | Examining the impact of Chickering's seven principles of good practice on student attrition in online courses in the community college | |
Horst | Measuring Achievement Gains in Educational Projects. | |
Laub | Computer-integrated learning system and elementary student achievement in mathematics: An evaluation study | |
Saunders | Meeting the Needs of Entering Students through Appropriate Placement in Entry-Level Writing Courses. | |
Heinze | Comprehensive assessment | |
Kralj et al. | Using PAT data to inform teaching and learning | |
Fuller | An evaluation of professional development on using student response systems and interactive whiteboards for formative assessment in the middle schools of a southeastern school district | |
Hockenberry et al. | How we came to dread Fridays: Developing an academic library assessment plan two hours at a time | |
Mause | Utilizing Parental Support and Resources to Improve Perceptions and Understanding of Standards-Based Grading and Reporting Practices. | |
Goldsmith | Technological Tools and Methods Used in Formative Assessment Activities | |
Malone et al. | Survey of K‐12 world language program evaluation | |
Gillespie et al. | ASSISTments Use During In-Person and Remote Instruction: A Case Study | |
Smith | Expanding Educational Technology Applications for Formative Assessment in Legal Education | |
Issa et al. | Factors Constraining Purchasing of Teaching and Learning Materials in Public Primary Schools in Dar Es Salaam-Tanzania |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: ASSESSMENT TECHNOLOGY INCORPORATED, ARIZONA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BENSON, BRIAN;BERGAN, JOHN ROBERT;FERRIS, LUKE;AND OTHERS;REEL/FRAME:027671/0643 Effective date: 20111129 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |