Backend/elastic/samples/compsci.json

{
  "id": "eeeb2d01-8315-454e-b33f-3d6caa25db45",
  "title": "Lexical analysis [Wiki]",
  "authors": ["Moeid Heidari", "Denis Gorbunov"],
  "category": "Computer Science",
  "topic": "Compiler Engineering",
  "subTopic": "Lexical Analysis",
  "summary": "Lorem Ipsum is simply dummy text of the printing and typesetting industry...",
  "tags": ["tag1", "tag2", "tag3"],
  "content": "# Lexical Analysis\n\nIn computer science, lexical analysis, lexing or tokenization is the process of converting a sequence of characters (such as in a computer program or web page) into a sequence of lexical tokens (strings with an assigned and thus identified meaning). A program that performs lexical analysis may be termed a lexer, tokenizer, or scanner, although scanner is also a term for the first stage of a lexer. A lexer is generally combined with a parser, which together analyze the syntax of programming languages, web pages, and so forth."
}
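
The `content` field above describes tokenization in prose. As a minimal sketch of that idea, the following regex-based tokenizer turns a character stream into `(kind, lexeme)` pairs; the token kinds and patterns are illustrative assumptions, not part of this sample document or any particular compiler.

```python
import re

# Illustrative token classes for a tiny expression language (assumed,
# not from the sample document). Order matters: earlier patterns win.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),           # integer literals
    ("IDENT",  r"[A-Za-z_]\w*"),  # identifiers
    ("OP",     r"[+\-*/=]"),      # single-character operators
    ("SKIP",   r"\s+"),           # whitespace, discarded
]
TOKEN_RE = re.compile("|".join(f"(?P<{k}>{p})" for k, p in TOKEN_SPEC))

def tokenize(source: str):
    """Yield (kind, lexeme) pairs for a source string."""
    for match in TOKEN_RE.finditer(source):
        kind = match.lastgroup
        if kind != "SKIP":  # drop whitespace tokens
            yield kind, match.group()

# list(tokenize("x = 42 + y")) ->
# [('IDENT', 'x'), ('OP', '='), ('NUMBER', '42'), ('OP', '+'), ('IDENT', 'y')]
```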
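
Since this file lives under `Backend/elastic/samples`, it is presumably meant to be loaded into Elasticsearch. A hypothetical sketch of doing so with the official Python client follows; the index name `articles` and the local connection URL are assumptions, so adjust them to the backend's actual configuration.

```python
import json
from elasticsearch import Elasticsearch

# Assumed local node; replace with the real cluster URL.
es = Elasticsearch("http://localhost:9200")

with open("Backend/elastic/samples/compsci.json") as f:
    doc = json.load(f)

# Reuse the document's own "id" field as the Elasticsearch document id,
# so re-running this script overwrites rather than duplicates the sample.
es.index(index="articles", id=doc["id"], document=doc)
```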