Iikura et al., 2021 - Google Patents
CVAE-Based Complementary Story Generation Considering the Beginning and Ending
- Document ID
- 7079436368037450771
- Author
- Iikura R
- Okada M
- Mori N
- Publication year
- 2021
- Publication venue
- International Symposium on Distributed Computing and Artificial Intelligence
Snippet
We studied the problem of the computer-based generation of a coherent story. In this study, we propose a model based on a conditional variational autoencoder (CVAE) that takes the first and final sentences of the story as input and generates the story complementarily. One …
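The page gives only this high-level description; the concrete architecture and training objective are not shown here. The sketch below illustrates one plausible way to wire up such a model, assuming GRU sentence encoders, a Gaussian latent variable with a condition-dependent prior, and a GRU decoder that generates the intermediate text conditioned on the latent code plus the embeddings of the first and final sentences. All module names, dimensions, and loss weights are illustrative assumptions, not the authors' implementation.

```python
# Minimal CVAE sketch for beginning/ending-conditioned story generation.
# Assumptions (not from the paper): GRU encoders, a diagonal-Gaussian latent
# variable, and a GRU decoder over the intermediate sentences.
import torch
import torch.nn as nn
import torch.nn.functional as F


class StoryCVAE(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512, z_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Shared encoder for the first and final sentences (the condition c).
        self.cond_enc = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Encoder for the full story, used only at training time (posterior).
        self.post_enc = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Prior p(z | c) and posterior q(z | x, c) parameterisations.
        self.prior_net = nn.Linear(2 * hid_dim, 2 * z_dim)
        self.post_net = nn.Linear(3 * hid_dim, 2 * z_dim)
        # Decoder generates the intermediate sentences from [z; c].
        self.dec_init = nn.Linear(z_dim + 2 * hid_dim, hid_dim)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def _encode(self, rnn, tokens):
        _, h = rnn(self.embed(tokens))   # h: (1, batch, hid_dim)
        return h.squeeze(0)

    def forward(self, first, final, story, story_in):
        # Condition c = [enc(first sentence); enc(final sentence)].
        c = torch.cat([self._encode(self.cond_enc, first),
                       self._encode(self.cond_enc, final)], dim=-1)
        # Posterior additionally sees the full story (training time only).
        h_story = self._encode(self.post_enc, story)
        mu_q, logvar_q = self.post_net(torch.cat([h_story, c], -1)).chunk(2, -1)
        mu_p, logvar_p = self.prior_net(c).chunk(2, -1)
        # Reparameterisation trick.
        z = mu_q + torch.randn_like(mu_q) * torch.exp(0.5 * logvar_q)
        # Decode the intermediate sentences conditioned on [z; c].
        h0 = torch.tanh(self.dec_init(torch.cat([z, c], -1))).unsqueeze(0)
        dec_out, _ = self.decoder(self.embed(story_in), h0)
        logits = self.out(dec_out)
        # KL( q(z|x,c) || p(z|c) ) between two diagonal Gaussians.
        kl = 0.5 * (logvar_p - logvar_q
                    + (logvar_q.exp() + (mu_q - mu_p) ** 2) / logvar_p.exp()
                    - 1).sum(-1).mean()
        return logits, kl


def loss_fn(logits, targets, kl, kl_weight=1.0):
    # Reconstruction term (token-level cross entropy) + weighted KL term.
    rec = F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                          targets.reshape(-1), ignore_index=0)
    return rec + kl_weight * kl
```

At inference time one would sample z from the prior p(z | c) and decode greedily or with beam search; this mirrors standard CVAE practice rather than the specific procedure used in the paper.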
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
        - G06F17/20—Handling natural language data
          - G06F17/27—Automatic analysis, e.g. parsing
            - G06F17/2705—Parsing
            - G06F17/2765—Recognition
            - G06F17/2785—Semantic analysis
          - G06F17/28—Processing or translating of natural language
            - G06F17/2872—Rule based translation
              - G06F17/2881—Natural language generation
          - G06F17/21—Text processing
            - G06F17/22—Manipulating or registering by use of codes, e.g. in sequence of text characters
        - G06F17/30—Information retrieval; Database structures therefor; File system structures therefor
          - G06F17/3061—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
          - G06F17/30861—Retrieval from the Internet, e.g. browsers
      - G06F9/00—Arrangements for programme control, e.g. control unit
        - G06F9/06—Arrangements for programme control, e.g. control unit using stored programme, i.e. using internal store of processing equipment to receive and retain programme
    - G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
      - G06N99/00—Subject matter not provided for in other groups of this subclass
        - G06N99/005—Learning machines, i.e. computers in which a programme is changed according to experience gained by the machine itself during a complete run
      - G06N5/00—Computer systems utilising knowledge based models
        - G06N5/02—Knowledge representation
          - G06N5/022—Knowledge engineering, knowledge acquisition
        - G06N5/04—Inference methods or devices
      - G06N3/00—Computer systems based on biological models
        - G06N3/02—Computer systems based on biological models using neural network models
      - G06N7/00—Computer systems based on specific mathematical models
        - G06N7/005—Probabilistic networks
  - G10—MUSICAL INSTRUMENTS; ACOUSTICS
    - G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
      - G10L15/00—Speech recognition
        - G10L15/08—Speech classification or search
          - G10L15/18—Speech classification or search using natural language modelling
Similar Documents
| Publication | Title |
|---|---|
| Yan et al. | A semantic and emotion‐based dual latent variable generation model for a dialogue system |
| Badjatiya et al. | Attention-based neural text segmentation |
| Pramanik et al. | Text normalization using memory augmented neural networks |
| Wang et al. | Learning distributed word representations for bidirectional LSTM recurrent neural network |
| Tran et al. | Semantic refinement GRU-based neural language generation for spoken dialogue systems |
| Konstantinov et al. | Approach to the use of language models BERT and Word2vec in sentiment analysis of social network texts |
| Naik et al. | Large data begets large data: studying large language models (LLMs) and its history, types, working, benefits and limitations |
| Dasgupta et al. | A review of generative AI from historical perspectives |
| Khan et al. | Empowering Urdu sentiment analysis: an attention-based stacked CNN-Bi-LSTM DNN with multilingual BERT |
| Yan et al. | Leveraging contextual sentences for text classification by using a neural attention model |
| Lin et al. | Multi-channel word embeddings for sentiment analysis |
| Diao et al. | Multi-granularity bidirectional attention stream machine comprehension method for emotion cause extraction |
| Gupta | A review of generative AI from historical perspectives |
| Wang et al. | Improving text classification through pre-attention mechanism-derived lexicons |
| Bensalah et al. | Combining word and character embeddings for Arabic chatbots |
| Nouar et al. | A deep neural network model with multihop self-attention mechanism for topic segmentation of texts |
| Wang et al. | KG-to-text generation with slot-attention and link-attention |
| Waheed et al. | Domain-controlled title generation with human evaluation |
| Ge et al. | The application of deep learning in automated essay evaluation |
| Iikura et al. | CVAE-Based Complementary Story Generation Considering the Beginning and Ending |
| Agarwal et al. | Next word prediction using Hindi language |
| Yu et al. | Multi‐scale event causality extraction via simultaneous knowledge‐attention and convolutional neural network |
| Du et al. | Read then respond: multi-granularity grounding prediction for knowledge-grounded dialogue generation |
| An et al. | Dialogue specific pre-training tasks for improved dialogue state tracking |
| Graef | Leveraging Text Classification by Co-training with Bidirectional Language Models – A Novel Hybrid Approach and Its Application for a German Bank |