A lexer translates text into tokens that you can use in your application. With these tokens you could build your own programming language or a JSON parser, for example. Since I'm a fan of OOP, you'll see that I applied an object-oriented approach. ChatGPT rates the code quality as good, and according to valgrind there are no memory leaks. In this part we're going to create:
token struct
token tests
Makefile
Requirements:
gcc
make
Writing token.h
Create a new file called "token.h".
Add an include guard:
#ifndef TOKEN_H_INCLUDED
#define TOKEN_H_INCLUDED
// Where the code goes
#endif
This prevents the file from being included twice.
Include the headers we need: string.h for the string helpers and stdlib.h for memory allocation.
#include <string.h>
#include <stdlib.h>
Define the configuration:
#define TOKEN_LEXEME_SIZE 256
This limits each token's lexeme to 256 characters, so huge strings are not supported. Dynamic memory allocation is beyond the scope of this tutorial.
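To see what that limit means in practice, here is a minimal sketch of a bounded copy into a fixed-size buffer. This snippet is illustrative and not part of the tutorial's code; `source` stands for any hypothetical input string:

char lexeme[TOKEN_LEXEME_SIZE];
// Copy at most TOKEN_LEXEME_SIZE - 1 chars, leaving room for the null byte.
strncpy(lexeme, source, TOKEN_LEXEME_SIZE - 1);
lexeme[TOKEN_LEXEME_SIZE - 1] = '\0'; // strncpy does not always null-terminate

Anything longer than 255 characters is silently truncated, which is the trade-off we accept for avoiding dynamic allocation.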
Define the token struct. The next and prev pointers let us chain tokens together into a doubly linked list later:
typedef struct Token {
    char lexeme[TOKEN_LEXEME_SIZE]; // the token text, e.g. "while" or "{"
    int line;                       // line where the token starts
    int col;                        // column where the token starts
    struct Token * next;            // next token in the list
    struct Token * prev;            // previous token in the list
} Token;
Implement the new function. This instantiates a Token with default values.
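A minimal sketch of what this constructor might look like; the name token_new and the zero/NULL defaults are my assumptions, and the tutorial's actual implementation may differ:

Token * token_new() {
    // Allocate the token on the heap so it can outlive the current scope.
    Token * token = (Token *) malloc(sizeof(Token));
    if (token == NULL) {
        return NULL; // allocation failed
    }
    // Default values: empty lexeme, position 0:0, not linked to anything.
    memset(token->lexeme, 0, TOKEN_LEXEME_SIZE);
    token->line = 0;
    token->col = 0;
    token->next = NULL;
    token->prev = NULL;
    return token;
}

Usage would then look something like this (again assuming the names above):

Token * token = token_new();
strncpy(token->lexeme, "hello", TOKEN_LEXEME_SIZE - 1);
free(token);

Returning NULL on a failed malloc lets the caller decide how to handle out-of-memory instead of crashing inside the constructor.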