
WO2017065324A1 - Sign language education system, method, and program - Google Patents

Info

Publication number
WO2017065324A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
word
sign language
sign
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2015/010744
Other languages
English (en)
Korean (ko)
Inventor
반호영
최용근
이수빈
양동석
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Neofect Co Ltd
Original Assignee
Neofect Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Neofect Co Ltd filed Critical Neofect Co Ltd
Priority to PCT/KR2015/010744 priority Critical patent/WO2017065324A1/fr
Priority to KR1020157029063A priority patent/KR101793607B1/ko
Publication of WO2017065324A1 publication Critical patent/WO2017065324A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 21/00: Teaching, or communicating with, the blind, deaf or mute

Definitions

  • the present invention relates to a sign language education system, method, and program, and more particularly, to a system, method, and program for evaluating and teaching a sign language of a user performed according to an item using a body movement measuring device.
  • Visually impaired people can only receive information through hearing.
  • In a video medium such as TV, the visually impaired can obtain information only through human speech or sound effects, because visual information such as movement and behavior cannot be conveyed to them.
  • broadcasting stations and the like may provide a service such as screen commentary broadcasting.
  • Hearing impaired people communicate through sign language, which is perceived visually, or through text on a computer or on paper.
  • However, because sign language and written language differ in grammar and expression systems, communication through text often fails to convey the full meaning, causing problems such as information distortion and loss.
  • the communication between the sign language and the non-sign language is mainly done through a sign language interpreter and a text method.
  • Interpretation is practically limited by cost, so a simple form of text is used as the main medium when communicating with the hearing impaired. However, due to the differences between sign language and Korean in grammar and expression systems, meaning conveyed in text is often lost for sign language users.
  • Sign language has many differences (e.g., differences in word order) from the grammar of general languages (i.e., non-sign conversational languages), so people have difficulty learning it.
  • Sign language may fail to communicate if the movements are not correct, and even accurate movements may fail to communicate if the sign order is wrong.
  • the terminal extracts the specific question data provided; Extracting reference sign data corresponding to the item data; Receiving sensing data from at least one body movement measuring device; Generating input sign data by combining one or more pieces of the sensing data; Calculating a comparison result by comparing the input sign language data with reference sign data; And evaluating the item data based on the comparison result.
  • the extracting of the reference sign data may include extracting the reference sign data matched with the item data stored in the database in the terminal.
  • the extracting the reference sign language data may include: changing the word order of the item data to a sign language order; And searching and matching sign language data corresponding to each word of the item data.
  • The sensing data receiving step may include: receiving sensing data of finger movement and wrist movement from the glove-type measuring device; and receiving sensing data from a measurement sensor device attached to each body part.
  • The sensing data receiving step may include: requesting the user to perform a specific reference posture or reference movement; and determining an initial position of each body movement measuring device according to the reference posture or reference movement.
  • the input sign data generating step may be characterized by calculating the positional relationship between the left hand and the right hand by tracking the movement of the body unit on the basis of the initial position.
  • When the body movement measuring device includes a vision sensor device, the sensing data receiving step may include: an image receiving step of receiving an image obtained by the vision sensor device, the image including both hands of the user; and recognizing a positional relationship between the left hand and the right hand in the image.
  • the evaluating step may include calculating a matching rate between the reference sign language data and the input sign data of a word corresponding to the item data when the item data is a word.
  • When the item data is a sentence, the evaluation may include: comparing one or more reference word data of the reference sign data with one or more input word data of the input sign data and calculating a matching rate; matching each input word data with the reference word data having the highest matching rate; calculating a word matching result based on the matching rate for each input word data; calculating a word order result by accumulating the distance difference between the input word data and the matched reference word data, where the distance difference is the number of words by which the input word data must be moved to occupy the same sentence position as the reference word data having the highest matching rate; and calculating an evaluation score reflecting both the word matching result and the word order result.
  • The method may further include generating feedback data for the user based on the word matching result and the word order result.
  • the method may further include determining a difficulty level of the next item data by reflecting the evaluation score.
  • a sign language education program is coupled to a hardware terminal to execute the sign language education method and is stored in a medium.
  • The user can be assessed on whether the sign language is actually performed correctly, which can help improve sign language skills.
  • The user can be evaluated not only on the accuracy of the movement for each word but also on the accuracy of the sign order of a specific sentence. Therefore, the user can learn a sign order that differs from the Korean word order.
  • According to the present invention, if only a database of the sign language of each language is added, the user can learn the sign languages of various countries through one terminal.
  • The positional relationship between the user's left hand and right hand can be obtained by using a vision sensor or by tracking the movement of each body movement measuring device from its initial position, so that it can be evaluated whether the sign is performed correctly.
  • FIG. 1 is a block diagram of a sign language education system according to an embodiment of the present invention.
  • FIG. 2 is a flowchart illustrating a sign language education method according to an embodiment of the present invention.
  • FIG. 3 is a flowchart illustrating a process of evaluating question data, which is a sentence, according to an embodiment of the present invention.
  • FIG. 4 is an exemplary diagram for calculating word matching results based on the coincidence rate of words according to an embodiment of the present invention.
  • the body movement measuring device corresponds to a device for sensing movement of a user and transmitting movement data to the terminal through wired or wireless communication.
  • the body movement measuring device may be in the form of a wearable device, or may be a sensor device that can be attached to various positions of the body.
  • the body movement measuring device may be a device of various types that can measure the body movement of the user.
  • When the body movement to be measured is a movement of the hand region, the body movement measuring apparatus may be implemented as a wearable type worn on the hand.
  • the body movement measuring device may be a sensor patch or the like that can be attached to each body region.
  • the body movement measuring device may include various measuring devices such as a wearable type (for example, a glove type) measuring device and a body type measuring sensor device.
  • FIG. 1 is a block diagram of a sign language education system according to an embodiment of the present invention.
  • Sign language education system is implemented by the internal configuration of the terminal 100.
  • the terminal 100 may be divided into a mobile terminal 100 and a fixed terminal 100 according to whether or not it is movable.
  • the terminal 100 may include all types of terminals 100 including the above configuration.
  • The terminal 100 may include, as mobile terminals 100, a cellular phone, a PCS (Personal Communication Service) phone, a synchronous/asynchronous IMT-2000 (International Mobile Telecommunication-2000) terminal, a palm personal computer, a PDA (Personal Digital Assistant), a smartphone, a WAP (Wireless Application Protocol) phone, a mobile game machine, a tablet PC, a netbook, or a notebook, and, as fixed terminals 100, a desktop PC, a television, and the like.
  • The terminal 100 includes all or part of a control unit 110, a communication unit 130, and an output unit 130.
  • the terminal 100 is not limited to the components described above, and may further include additional components.
  • the controller 110 typically controls the overall operation of the terminal 100. For example, it performs related control and processing for data communication, image processing for reproduction on the display unit, body movement evaluation (eg, evaluation of input sign data according to user's body movement), and the like. Various functions performed by the controller 110 will be described later.
  • the communication unit 130 performs a function of receiving sensing data from the body movement measuring apparatus 200. In addition, the communication unit 130 performs a function of transmitting the received sensing data to the control unit 110. In addition, the communication unit 130 may transmit an output according to the evaluation result calculated based on the sensing data to the body movement measuring apparatus 200.
  • The communication unit 130 may include a wired communication unit connected to the body movement measuring apparatus 200 by wire, or a wireless communication unit that receives the movement data from the body movement measuring apparatus 200 wirelessly.
  • The wireless communication unit may include a wireless internet module or a short-range communication module.
  • the wireless internet module refers to a module for wireless internet access and may be embedded or external to the terminal 100.
  • Wireless internet technologies such as Wireless LAN (Wi-Fi), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), Long Term Evolution (LTE), and LTE-A (Long Term Evolution-Advanced) can be used.
  • the short range communication module refers to a module for short range communication.
  • Short range communication technologies include Bluetooth, BLE (Bluetooth Low Energy), Beacon, Radio Frequency Identification (RFID), Near Field Communication (NFC), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, NRF, etc. may be used.
  • the output unit 130 performs a function of outputting information to be provided to the user.
  • the output unit 130 may include a display unit, a sound output unit, and the like.
  • the display unit displays (outputs) information processed by the terminal 100.
  • the display unit may be implemented as a touch screen by being combined with a touch sensor.
  • the display unit may receive an input operation from the user through a touch operation.
  • the terminal 100 may select an item to be evaluated by receiving a user's touch manipulation at a point corresponding to the specific item data from the item list displayed on the screen.
  • Various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware or a combination thereof.
  • the terminal 100 may further include a memory.
  • The memory may store a program for the operation of the controller 110, and may store input/output data or data generated during the evaluation of body movement (for example, learning data obtained by receiving and storing movement data).
  • the memory may be included in the controller 110.
  • The memory may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk.
  • the mobile terminal 100 may operate in connection with a web storage that performs a storage function of the memory on the Internet.
  • the embodiments described herein include application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), and field programmable gate arrays (FPGAs). It may be implemented using at least one of processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. In some cases, the embodiments described herein may be implemented by the controller 110 itself.
  • embodiments such as the procedures and functions described herein may be implemented as separate software modules.
  • Each of the software modules may perform one or more functions and operations described herein.
  • the software code may be implemented as a software application written in a suitable programming language.
  • the software code may be stored in a memory and executed by the controller 110.
  • FIG. 2 is a flowchart illustrating a sign language education method according to an embodiment of the present invention.
  • the terminal 100 extracts and provides a specific item data (S100); Extracting reference sign data corresponding to the item data (S200); Receiving the sensing data from the at least one body movement measuring device 200 (S300); Generating input sign data by combining one or more pieces of the sensing data (S400); Comparing the input sign language data with reference sign data and calculating a comparison result (S500); And performing an evaluation on the item data based on the comparison result (S600).
  • Sign language education method according to an embodiment of the present invention will be described in order.
  • The terminal 100 extracts and provides specific item data (S100). That is, the terminal 100 extracts specific item data according to an item selection request received from the user. For example, when the body movement measuring apparatus 200 is a hand-wearable device, the user may select the item data by pointing the hand at the point on the screen where specific item data is displayed and then performing a manipulation operation of closing the hand.
  • When the display unit is a touch screen, the terminal 100 may display a list of various item data on the screen and receive a touch operation selecting specific item data from the user. Thereafter, the terminal 100 may provide the selected item data to the user through the display unit or the audio output unit.
  • the terminal 100 extracts reference sign data corresponding to the item data (S200).
  • The reference sign data is data used for comparison with the user's sign language movement (i.e., the input sign data) acquired by the body movement measuring apparatus 200, and corresponds to a sign language expression whose meaning matches the specific item data.
  • the terminal 100 may extract reference sign language data corresponding to the item data in various ways.
  • the standard sign language data extraction method is not limited to the method described below, and various methods may be applied.
  • The terminal 100 may extract the reference sign data stored in the internal database in association with the specific item data. That is, the terminal 100 may store sign language movement data (i.e., reference sign data) corresponding to specific item data in the memory.
  • the terminal 100 may store a sign language action expression corresponding to the item data, which is a specific word or a simple expression (eg, a greeting expression, etc.), and may store a sign language action expression corresponding to the item data, which is a specific sentence.
  • the terminal 100 may store reference sign language data generated in advance in order of sign language order in a database.
  • the terminal 100 may match and store the sign language for each country with respect to the same item data in the database.
  • the terminal 100 may extract and present a sign language expression of a specific country set by a user as reference sign data.
  • the terminal 100 may directly generate reference sign data corresponding to the selected or extracted specific item data. That is, the reference sign language data extracting step (S200) may include: changing the word order of the item data to a sign language order; And searching and matching sign language data corresponding to each word of the item data.
  • The terminal 100 includes data on sign language word order rules, and may convert the extracted item data (i.e., a text sentence) into sign language order based on the word order rule data.
  • The terminal 100 may first divide the sentence into word units and then rearrange them. Partitioning into words can be performed based on the spacing in the sentence. Thereafter, the terminal 100 may search for each word in a database to load the corresponding sign language expression, and may generate the reference sign data by connecting one or more sign language expressions.
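  • The reordering-and-lookup procedure above can be sketched as follows. The word-order rule (verbs move to the end) and the dictionary entries are hypothetical illustrations, not the actual rule data or database of the invention:

```python
# Hypothetical sketch: convert a text sentence into sign-language word order
# and look up a sign expression for each word. The reordering rule (verbs
# move to the end) and the dictionary contents are illustrative assumptions.

SIGN_DICTIONARY = {
    "I": "sign_I",
    "school": "sign_school",
    "go": "sign_go",
}
VERBS = {"go"}

def to_sign_order(sentence):
    """Split on spacing, then move verbs to the end (illustrative rule)."""
    words = sentence.split()  # partition into word units by spacing
    non_verbs = [w for w in words if w not in VERBS]
    verbs = [w for w in words if w in VERBS]
    return non_verbs + verbs

def build_reference_sign_data(sentence):
    """Look up the sign expression for each reordered word and connect them."""
    return [SIGN_DICTIONARY[w] for w in to_sign_order(sentence)]

print(build_reference_sign_data("I go school"))
# ['sign_I', 'sign_school', 'sign_go']
```

A real system would replace the toy rule with the word order rule data and the per-country sign databases described above.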
  • the terminal 100 receives sensing data from at least one body movement measuring apparatus 200 (S300).
  • the at least one body movement measuring device 200 may include a glove-type measuring device, a body-mounted measuring sensor device, a vision sensor device, and the like.
  • the glove-type measuring device can be worn on both hands of the user.
  • The attachable measuring sensor device may be attached to the lower arm (i.e., from elbow to wrist) or the upper arm (i.e., from shoulder to elbow), the body parts used for sign language.
  • the body-mounted measuring sensor device may be attached to the upper and lower arms, respectively, to measure the bending state, the movement state, and the positional relationship of the arm.
  • The sensing data receiving step (S300) may include: receiving sensing data of finger movement and wrist movement from the hand-worn measuring device; and receiving sensing data from an attachable measuring sensor device attached to each body part.
  • The hand-wearable measuring device may be worn on both hands of the user, and may measure the bending state or movement of each finger and the bending or movement of the wrist (for example, the direction the palm faces as the wrist rotates).
  • The hand-wearable measuring device may include a bending sensor, an IMU sensor, or the like, and measure the state or movement of the fingers and the wrist.
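  • As a minimal sketch of how bending-sensor readings might be turned into a hand-shape representation, the following normalizes each raw reading against per-sensor calibration values. The calibration scheme and the value ranges are illustrative assumptions, not part of the described device:

```python
# Hypothetical sketch of turning raw glove bending-sensor readings into a
# hand-shape vector. Calibration ranges are illustrative assumptions.

def normalize_finger_bend(raw, flat, fist):
    """Map a raw bending-sensor reading onto 0.0 (straight) .. 1.0 (fully bent)."""
    value = (raw - flat) / (fist - flat)
    return min(1.0, max(0.0, value))  # clamp to the calibrated range

def hand_shape(raw_readings, calibration):
    """raw_readings: one bending-sensor value per finger; calibration: (flat, fist) pairs."""
    return [
        normalize_finger_bend(raw, flat, fist)
        for raw, (flat, fist) in zip(raw_readings, calibration)
    ]

# Example: five fingers, each sensor calibrated to read 100 flat and 900 in a fist.
calibration = [(100, 900)] * 5
print(hand_shape([100, 900, 500, 300, 700], calibration))
# [0.0, 1.0, 0.5, 0.25, 0.75]
```

The resulting vector is one plausible per-finger component of the hand shape that the terminal combines into the input sign data.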
  • The attachable measuring sensor device may include an inertial sensor, and may be attached to each body part to calculate its movement. That is, the attachable measuring sensor devices attached to the upper arm and the lower arm acquire sensing data through their inertial sensors and transmit it to the terminal 100, and the terminal 100 can calculate the positional relationship between the upper arm and the lower arm based on the sensing data.
  • the vision sensor device may obtain image data for measuring the movement of the hand and arm by the terminal 100 as sensing data.
  • The hand-wearable measuring device or the attachable measuring sensor device may include an identification mark recognizable by the vision sensor device, and the vision sensor device may measure the movement of one or more identification marks and transmit it to the terminal 100.
  • The sensing data receiving step may include: an image receiving step of receiving an image obtained by the vision sensor device, the image including both hands of the user; and recognizing a positional relationship between the left hand and the right hand in the image.
  • the sensing data obtained by the vision sensor device may be used to calculate the positional relationship of both arms and hands of the user by the terminal 100, as described below.
  • the vision sensor device may be formed of a necklace type, a deck type disposed in front of a user, an internal mounting type of the terminal 100, or the like.
  • the vision sensor device may acquire the movement of both hands and arms in front of the user's body as an image.
  • When the vision sensor is a deck type or is attached to or embedded in the terminal 100, it is disposed in front of the user at the time the sign language expression of the item data is input (i.e., when the input sign data is entered), so that a front image including the movement of both hands and arms of the user can be obtained.
  • the terminal 100 combines one or more pieces of the sensing data to generate input sign data (S400). That is, the terminal 100 may combine the sensing data acquired by the one or more body movement measuring apparatus 200 to generate the movement of the entire body part to which the body movement measuring apparatus 200 is attached as input sign data.
  • The terminal 100 may calculate the positional relationship between the upper arm and the lower arm, corresponding to the shape of each arm (for example, the bent state of the arm, the arrangement of each arm, and the type of movement of each arm), through the sensing data measured by the attachable measuring sensor devices attached to both arms.
  • The terminal 100 may calculate the shape of the user's hands at each point in time through the sensing data received from the hand-wearable measuring sensor device (for example, the bending state of each finger, the bending state of the wrist, and the rotation state of the wrist). Thereafter, the terminal 100 may generate the entire sign language expression performed by the user (i.e., the input sign data) by combining the arrangement state of each arm with the shape of each hand.
  • When the terminal 100 receives sensing data (i.e., image data) acquired by the vision sensor device, the terminal 100 may analyze the sensed image acquired through the vision sensor and calculate the positional relationship of both hands or both arms.
  • the terminal 100 may generate a total sign language expression (ie, input sign data) performed by a user by combining the positional relationship of both hands or arms calculated through an image with an arm shape and a hand shape.
  • When the terminal 100 receives the result of a reference posture or reference movement performed by the user, the terminal 100 may track the relative position of each measurement point (for example, a body point to which an attachable measuring sensor device is attached) from the initial position (or reference position) identified through that reference posture or reference movement. The terminal 100 may then calculate the positional relationship of both hands or both arms based on their relative positions with respect to the reference position at the same point in time and, by combining this with the shape of the arms and the shape of the hands, generate the entire sign language expression performed by the user (i.e., the input sign data).
  • The terminal 100 compares the input sign data with the reference sign data to calculate a comparison result (S500). That is, the terminal 100 may calculate a result by comparing the reference sign data (i.e., the reference sign language movement) corresponding to the item data with the generated input sign data (i.e., the sign language movement performed by the user).
  • the reference sign language data may correspond to one or more pieces of data obtained by measuring a reference user's movement for a sign language operation corresponding to the corresponding word in a dictionary.
  • The reference sign data may include the acceleration sensor value, gyro sensor value, and geomagnetic sensor value of each axis over time, as well as image information values (for example, data values such as the position, tilt, and direction of a specific body part in the image acquired through the vision sensor device).
  • The terminal may accumulate the sign language movements received from users as input sign data, rather than relying on one specific piece of sensing data, and perform learning using a statistical classification method or a machine learning classification method such as a support vector machine, a decision tree, or deep learning, to generate a specific value or a specific range corresponding to the reference sign data.
  • the terminal may calculate and compare the difference or distance between the two data with respect to the change of the input sign data and the reference sign data over time. For example, the terminal may receive reference sign data through the operation of a real person wearing a measurement sensor device and store sensing values of the measurement sensor devices corresponding to each time point. Thereafter, the terminal may calculate a difference by comparing the sensing value of each measurement sensor device according to the input sign data with the sensing value of each measurement sensor device according to the reference sign data. In addition, the difference between the input sign data and the reference sign data may be calculated by changing a domain for data analysis (for example, changing the frequency domain).
  • The terminal may compare the reference sign data with the input sign data by applying a dynamic time warping (DTW) algorithm.
  • The method of comparing the reference sign data and the input sign data is not limited to these, and various other methods may be applied.
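  • A minimal version of the DTW comparison mentioned above can be sketched as follows, using scalar traces for brevity; a real system would compare multi-axis sensor vectors at each time step:

```python
# Minimal dynamic time warping (DTW) sketch for comparing two 1-D sensor
# traces, one possible realization of the comparison step. The traces and
# the absolute-difference local cost are illustrative simplifications.

def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) DTW with absolute-difference local cost."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping steps
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

reference = [0, 1, 2, 3, 2, 1, 0]            # reference sign trace
slower = [0, 0, 1, 1, 2, 3, 3, 2, 1, 0]      # same gesture, performed slower
print(dtw_distance(reference, slower))
# 0.0: the warping absorbs the timing difference between the two traces
```

This illustrates why DTW suits the comparison step: a user who signs the same movement faster or slower than the reference still scores a small distance.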
  • When the item data is a word, the terminal 100 can calculate a coincidence rate between the reference sign data and the input sign data for the word corresponding to the item data. That is, since the item data corresponds to a single sign language expression and no word order is involved, the match rate of the movement can be calculated by comparing the reference sign data and the input sign data one to one.
  • When the item data is a sentence, the comparison may include comparing one or more reference word data of the reference sign data with one or more input word data of the input sign data and calculating a matching rate.
  • the user may input a sign language operation in a different order from the word order corresponding to the item data.
  • In order to evaluate the accuracy of word order, the terminal 100 must identify which word in the input sign data corresponds to each word of the reference sign data. To this end, the terminal 100 divides the reference sign data and the input sign data into word units (i.e., generates one or more reference word data and one or more input word data) and cross-compares all the reference word data with all the input word data to calculate the coincidence rates.
  • the terminal 100 may determine the input word data matching the specific reference word data in the evaluation performing step S600 described later through the matching rate calculated by performing mutual comparison between words.
  • the terminal receives a signal according to an operation of an electrical switch, a mechanical switch, or a pressure sensor provided or attached to a body movement measuring device from a user.
  • For example, a user may signal the end of a word to the terminal by operating a switch provided at a specific position of the glove-type measuring device (for example, a detection area of the glove-type measuring device that can be easily operated with the thumb), and the terminal may recognize the end of the word from this signal.
  • the terminal may recognize that a specific word is over when the operation for a specific word is maintained for a predetermined time.
  • The terminal may also recognize that a particular word has ended and the next word is being performed when the time range generally taken to perform each word is exceeded.
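  • The timing-based word-boundary rule above might be sketched as follows: a word is treated as finished when the sensed motion stays nearly still for a minimum number of samples. The threshold and hold-length values are illustrative assumptions, not parameters from the invention:

```python
# Hypothetical sketch of timing-based word-boundary detection: a word ends
# when the motion trace holds nearly still for `hold_samples` consecutive
# samples. Threshold and hold length are illustrative assumptions.

def word_boundaries(samples, still_threshold=0.1, hold_samples=3):
    """Return indices where a hold (low frame-to-frame change) ends a word."""
    boundaries = []
    still_run = 0
    for i in range(1, len(samples)):
        if abs(samples[i] - samples[i - 1]) < still_threshold:
            still_run += 1
            if still_run == hold_samples:  # held still long enough: word ended
                boundaries.append(i)
        else:
            still_run = 0  # motion resumed: reset the hold counter
    return boundaries

# Motion trace: movement, a hold, more movement, a final hold.
trace = [0.0, 0.5, 1.0, 1.0, 1.0, 1.0, 0.4, 0.9, 0.9, 0.9, 0.9]
print(word_boundaries(trace))
# [5, 10]
```

In practice this pause-based rule could be combined with the switch signal described above, each serving as a fallback for the other.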
  • the terminal 100 evaluates the item data based on the comparison result (S600).
  • The terminal 100 may calculate a score corresponding to the comparison result (that is, the match rate between the reference word data and the input word data).
  • The terminal 100 evaluates the degree of word order matching based on the matching rates obtained through mutual comparison between the reference word data and the input word data, and also evaluates the degree of match for each individual word.
  • As shown in Figure 3, the evaluation performing step (S600) may include: matching each input word data with the reference word data having the highest matching rate (S610); calculating a word order result by accumulating the distance difference between the matched input word data and the reference word data, wherein the distance difference is the number of words by which the input word data must be moved to be placed at the same sentence position as the reference word data having the highest matching rate (S620); calculating a word matching result based on the matching rate for each of the input word data (S630); and calculating an evaluation score based on the word order result and the word matching result (S640).
  • the terminal 100 may match each piece of input word data with the reference word data having the highest match rate, based on the comparison between the reference word data and the input word data (S610). For example, as shown in FIG. 4, a match rate is calculated by comparing the word data of the input sentence corresponding to the input sign data with that of the reference sentence corresponding to the reference sign data, and the reference word with the highest match rate is selected for each input word (that is, input word 1 may be matched to reference word 3, input word 2 to reference word 2, and input word 3 to reference word 1).
  • the terminal 100 may calculate the word-order result by accumulating the distance differences between the matched input word data and reference word data (S620).
  • the distance difference may be the number of words a piece of input word data must be moved to occupy the same position in the sentence as the reference word data with which it has the highest match rate (that is, its matched reference word data). Since the input word data having the highest match rate with a piece of reference word data is most likely the expression the user intended when inputting that reference word, the terminal 100 can calculate the degree of word-order error, or the word-order match rate, from the distance differences between each piece of input word data and its best-matching reference word data. For example, as shown in FIG. 4, input word 3 is moved by two words to be placed in the position of reference word 1, and input word 1 is moved by two words to be placed in the position of reference word 3.
  • the word-order result, which is the total number of shifts, therefore corresponds to four.
  • the terminal 100 may calculate a word-matching result based on the match rate for each piece of input word data (S630). That is, the terminal 100 may calculate the word-matching result by reflecting the match rate between each piece of input word data and the reference word data determined to match it. The terminal 100 may hold evaluation criteria according to the match rate and calculate the word-matching result accordingly. Thereafter, the terminal 100 may calculate an evaluation score by reflecting both the word-order result and the word-matching result (S640).
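A minimal sketch of steps S610 to S640 — an illustration, not the patented implementation. It assumes the match rates have already been computed as a matrix `match_rates[i][j]` between input word i and reference word j, and the `order_weight` blend and shift normalization are hypothetical choices:

```python
def evaluate(match_rates, order_weight=0.5):
    """Evaluate a signed sentence from a match-rate matrix.

    match_rates[i][j] is the match rate (0.0..1.0) between input word i
    and reference word j.
    """
    n = len(match_rates)
    # S610: match each input word to the reference word with the highest match rate
    matches = [max(range(len(row)), key=lambda j: row[j]) for row in match_rates]
    # S620: word-order result -- accumulate how far each input word must be
    # shifted to sit in its matched reference word's position
    order_shifts = sum(abs(i, ) if False else abs(i - j) for i, j in enumerate(matches))
    # S630: word-matching result -- average match rate of the matched pairs
    word_match = sum(match_rates[i][j] for i, j in enumerate(matches)) / n
    # S640: combine both results into a 0..100 evaluation score
    # (dividing shifts by n*(n-1) is a hypothetical normalization)
    order_score = 1.0 - order_shifts / max(1, n * (n - 1))
    score = round(100 * ((1 - order_weight) * word_match + order_weight * order_score))
    return {"order_shifts": order_shifts, "word_match": word_match, "score": score}
```

With the FIG. 4 arrangement (input word 1 best matches reference word 3, input word 2 reference word 2, input word 3 reference word 1), `order_shifts` comes out as 2 + 0 + 2 = 4, the four shifts described above.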
  • the method may further include generating feedback data for the user based on the word-order result and the word-matching result. That is, the terminal 100 may provide the user with feedback (that is, an incorrect-answer commentary) on which part of the input sign data was wrong. For example, if the word order of the input sign data is wrong in comparison with the reference sign data, the terminal 100 may provide the user with an explanation of the correct word order. In addition, when the match rate between a specific piece of input word data and its matched reference word data is equal to or less than a specific value, the terminal 100 may display the reference word data corresponding to the correct answer on the screen, together with an indication of which part of the motion was wrong.
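The incorrect-answer commentary described above could be sketched as follows; the 0.6 threshold and the message wording are invented for illustration, and each matched pair is assumed to be given as (input position, reference position, match rate):

```python
def build_feedback(matched_pairs, threshold=0.6):
    """Return incorrect-answer commentary for a signed sentence.

    matched_pairs: list of (input_pos, ref_pos, match_rate) tuples,
    using 0-based positions.
    """
    notes = []
    for input_pos, ref_pos, rate in matched_pairs:
        # word-order feedback: the word was signed in the wrong position
        if input_pos != ref_pos:
            notes.append(
                f"Word {input_pos + 1} belongs at position {ref_pos + 1} in the reference sentence."
            )
        # word-motion feedback: the motion itself matched too weakly
        if rate <= threshold:
            notes.append(
                f"Word {input_pos + 1}: motion differs from the reference sign "
                f"(match rate {rate:.0%}); showing the correct sign."
            )
    return notes
```

A pair that is both out of order and weakly matched produces two notes, mirroring the separate word-order and motion explanations in the text.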
  • the method may further include determining the difficulty level of the next item data by reflecting the evaluation score.
  • the terminal 100 may thus provide item data corresponding to the user's level in order, preventing the user from losing interest in sign language learning.
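One plausible way to realize this adaptive difficulty selection — the level range and score thresholds below are hypothetical, not taken from the disclosure:

```python
def next_difficulty(current_level, score, max_level=10, raise_at=80, lower_below=50):
    """Pick the difficulty of the next item from the evaluation score.

    Raise the level after a high score, lower it after a low score,
    and otherwise keep the user at the current level.
    """
    if score >= raise_at:
        return min(max_level, current_level + 1)
    if score < lower_below:
        return max(1, current_level - 1)
    return current_level
```

Clamping at 1 and `max_level` keeps the user inside the item database's range however many items in a row they pass or fail.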
  • the terminal 100 may calculate the movement of both of the user's arms or the positional relationship between the two hands through an initial-position-setting method. That is, the initial position of each attached measurement sensor device or hand-wearable measuring device can be set, and the position relative to that initial position can be measured at each point in time to determine the movement of the user's arms or the positional relationship between the two hands.
  • the sensing data receiving step (S300) may include: requesting the user to perform a specific reference posture or reference movement; and determining the initial position of each body movement measuring apparatus 200 according to that reference posture or reference movement. That is, the terminal 100 may set an initial position (reference position) for determining sign language motion after receiving a specific reference posture or reference movement from the user. For example, the terminal 100 may request a reference posture from the user and set the state at the moment the user performs that posture as the initial state.
  • in the input sign data generating step (S400), the positional relationship between the left and right hands can be calculated by tracking the movement of each body unit from the initial position.
  • the attached measurement sensor device may include an inertial sensor, and the state at a specific point in time may be determined by accumulating the user's movement as measured by the inertial sensor. That is, the position at a specific point in time may be calculated relative to the initial position, based on the magnitude and direction of the acceleration and the inclination angle accumulated by the measurement sensor device attached to the specific body part.
  • the terminal 100 may calculate the relative position at a first time point (the first time a relative position is calculated after the initial position is set) with respect to the initial position, based on the sensing data measured at that first time point, and may calculate the relative position at a second time point (a relative-position calculation point after the first time point) with respect to the relative position at the first time point, based on the sensing data measured at the second time point.
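The step-by-step accumulation just described can be illustrated with a naive dead-reckoning integrator. This is a sketch only — a real device would also use the measured inclination to rotate accelerations into a common frame and correct for drift — and the sample format is an assumption:

```python
def track_positions(initial_position, accel_samples, dt):
    """Integrate per-axis accelerations into positions over time.

    initial_position: (x, y, z) set during the reference-posture calibration.
    accel_samples: iterable of (ax, ay, az) accelerations, one per time step.
    dt: sampling interval in seconds.
    Returns the position at each time point, each relative to the one before,
    starting from the calibrated initial position.
    """
    x, y, z = initial_position
    vx = vy = vz = 0.0
    positions = []
    for ax, ay, az in accel_samples:
        # accumulate velocity from acceleration, then position from velocity,
        # so each time point builds on the previous one
        vx += ax * dt; vy += ay * dt; vz += az * dt
        x += vx * dt; y += vy * dt; z += vz * dt
        positions.append((x, y, z))
    return positions
```

Running one such tracker per hand, the positional relationship between the two hands at any time point is simply the difference of the two tracked positions.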
  • the sign language teaching method according to an embodiment of the present invention may be implemented as a program (or application) stored in a medium so as to be executed in combination with the terminal 100, which is hardware.
  • in order for the terminal 100 to read the program and execute the methods implemented as a program, the above program may include code written in a computer language, such as C, C++, JAVA, or machine language, that the processor (CPU) of the terminal 100 can read through its device interface.
  • such code may include functional code associated with the functions that define the operations necessary for executing the methods, and may include execution-procedure-related control code necessary for the processor of the terminal to execute a predetermined procedure.
  • the code may further include memory-reference code indicating which location (address) of the terminal's internal or external memory should be referenced for the additional information or media required for the processor to execute the functions.
  • in addition, when the processor of the terminal needs to communicate with a remote computer or server in order to execute the functions, the code may further include communication-related code specifying how to communicate using the communication module of the terminal and what information or media should be transmitted and received during communication.
  • the storage medium is not a medium that stores data for a brief moment, such as a register, a cache, or volatile memory, but a medium that stores data semi-permanently and can be read by a device.
  • examples of the storage medium include, but are not limited to, ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices. That is, the program may be stored in various recording media on various servers that the terminal can access, or in various recording media on the user's terminal. The media may also be distributed over network-coupled computer systems so that the computer-readable code is stored in a distributed fashion.
  • the user can be evaluated on whether his or her actual sign language is performed correctly, which can help improve sign language skills.
  • the user can be evaluated not only on the accuracy of the motion for each word but also on the accuracy of the sign language word order of a specific sentence. Therefore, the user can learn a sign language word order that differs from the Korean word order.
  • according to the present invention, if only a database of the different sign languages is added for each language, the user can learn the sign languages of various countries through a single terminal.
  • the present invention can evaluate whether the positional relationship between the user's two hands is correct.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Electrically Operated Instructional Devices (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a sign language learning system, method, and program. A sign language learning method according to an embodiment of the present invention comprises: a step (S100) in which a terminal extracts particular item data and provides said data; a step (S200) of extracting reference sign language data corresponding to the item data; a step (S300) of receiving pieces of sensing data from one or more body movement measuring devices; a step (S400) of combining said pieces of sensing data and generating input sign language data; a step (S500) of comparing the input sign language data with the reference sign language data and calculating a comparison result; and a step (S600) of performing an evaluation of the item data based on the comparison result. The present invention makes it possible to improve a user's sign language proficiency by evaluating the accuracy of the actual sign language gestures performed by the user.
PCT/KR2015/010744 2015-10-13 2015-10-13 Sign language learning system, method and program Ceased WO2017065324A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/KR2015/010744 WO2017065324A1 (fr) 2015-10-13 2015-10-13 Sign language learning system, method and program
KR1020157029063A KR101793607B1 (ko) 2015-10-13 2015-10-13 Sign language education system, method and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2015/010744 WO2017065324A1 (fr) 2015-10-13 2015-10-13 Sign language learning system, method and program

Publications (1)

Publication Number Publication Date
WO2017065324A1 true WO2017065324A1 (fr) 2017-04-20

Family

ID=58518341

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/010744 Ceased WO2017065324A1 (fr) 2015-10-13 2015-10-13 Sign language learning system, method and program

Country Status (2)

Country Link
KR (1) KR101793607B1 (fr)
WO (1) WO2017065324A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110457673A (zh) * 2019-06-25 2019-11-15 北京奇艺世纪科技有限公司 Method and device for converting natural language into sign language

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
  • KR20200094570A (ko) 2019-01-30 2020-08-07 한밭대학교 산학협력단 Sign language interpretation system consisting of a sign language glove and language-conversion glasses
  • KR102436239B1 (ko) * 2020-11-28 2022-08-24 동서대학교 산학협력단 Hand-shape matching VR sign language education system using gesture recognition technology
  • KR102576358B1 (ko) * 2022-12-23 2023-09-11 주식회사 케이엘큐브 Apparatus for generating training data for sign language translation and operating method thereof

Citations (5)

Publication number Priority date Publication date Assignee Title
KR19990081035A (ko) * 1998-04-24 1999-11-15 정선종 System and method for communication between the visually impaired and the hearing impaired
JP2000330467A (ja) * 1999-05-18 2000-11-30 Hitachi Ltd Sign language education device, sign language education method, and recording medium on which the sign language education method is recorded
US20030191779A1 (en) * 2002-04-05 2003-10-09 Hirohiko Sagawa Sign language education system and program therefor
KR20070081479A (ko) * 2006-02-13 2007-08-17 구자효 Self-learning device for sign language
KR100953979B1 (ko) * 2009-02-10 2010-04-21 김재현 Sign language learning system


Cited By (2)

Publication number Priority date Publication date Assignee Title
CN110457673A (zh) * 2019-06-25 2019-11-15 北京奇艺世纪科技有限公司 Method and device for converting natural language into sign language
CN110457673B (zh) * 2019-06-25 2023-12-19 北京奇艺世纪科技有限公司 Method and device for converting natural language into sign language

Also Published As

Publication number Publication date
KR101793607B1 (ko) 2017-11-20
KR20170054198A (ko) 2017-05-17

Similar Documents

Publication Publication Date Title
US10446059B2 (en) Hand motion interpretation and communication apparatus
WO2020145678A1 (fr) Système et procédé de détection de langues multiples parlées
US20160042228A1 (en) Systems and methods for recognition and translation of gestures
US8793118B2 (en) Adaptive multimodal communication assist system
US10585488B2 (en) System, method, and apparatus for man-machine interaction
WO2020027540A1 (fr) Appareil et procédé de compréhension de langage naturel personnalisé
CN104850542B (zh) 非可听语音输入校正
CN109670174B (zh) 一种事件识别模型的训练方法和装置
US20230025776A1 (en) Reception apparatus, reception system, reception method, and storage medium
WO2020262800A1 (fr) Système et procédé d'automatisation de compréhension de langage naturel (nlu) pour un développement de compétence
WO2017065324A1 (fr) Système, procédé et programme d'apprentissage de langue des signes
Watanabe et al. Advantages and drawbacks of smartphones and tablets for visually impaired people——analysis of ICT user survey results——
WO2019190076A1 (fr) Procédé de suivi des yeux et terminal permettant la mise en œuvre dudit procédé
CN108877334A (zh) 一种语音搜题方法及电子设备
KR102009150B1 (ko) 수화 또는 지화 인식 장치 및 방법
WO2015037871A1 (fr) Système, serveur et terminal permettant de fournir un service de lecture vocale au moyen d'une reconnaissance de textes
EP3467820A1 (fr) Dispositif de traitement d'informations et procédé de traitement d'informations
CN111027353A (zh) 一种搜索内容的提取方法及电子设备
US11630574B2 (en) Screen control method for providing notification of objects having different meanings for each region and electronic device supporting same
WO2024043563A1 (fr) Système et procédé pour modèle d'apprentissage automatique profond utilisant des données personnelles en temps réel
CN113409770A (zh) 发音特征处理方法、装置、服务器及介质
WO2023219267A1 (fr) Système et procédé de détection de mot de réveil de niveau de trame indépendant de l'accent
JP2024001050A (ja) ポインティングに基づく情報提供方法およびシステム
AU2021101436A4 (en) Wearable sign language detection system
Biju et al. A review of factors that impact the design of a glove based wearable devices

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 20157029063

Country of ref document: KR

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15906288

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 23/08/2018)

122 Ep: pct application non-entry in european phase

Ref document number: 15906288

Country of ref document: EP

Kind code of ref document: A1