I have the following ocamllex code:
let flt = ['-' '+']?['0'-'9']+ ['.'] ['0'-'9']+
rule token = parse
[' ' '\t' '\r' '\n'] { token lexbuf } (* Whitespace *)
| ['0'-'9']+ as lxm { INTEGER(int_of_string lxm) }
| flt as lxm { FLOAT(float_of_string lxm) }
This works!
But the minute I try to allow + and - signs for the INTEGER, it gives me an error.
let flt = ['-' '+']?['0'-'9']+ ['.'] ['0'-'9']+
rule token = parse
[' ' '\t' '\r' '\n'] { token lexbuf } (* Whitespace *)
| ['+' '-']['0'-'9']+ as lxm { INTEGER(int_of_string lxm) }
| flt as lxm { FLOAT(float_of_string lxm) }
The error is as follows:
Fatal error: exception Failure("int_of_string")
Undefined symbols for architecture x86_64:
"_main", referenced from:
implicit entry/start for main executable
ld: symbol(s) not found for architecture x86_64
The funny thing is that in my .ml file, I am using "float_of_string", but I am NOT using "int_of_string" anywhere.
int_of_string does not handle a leading "+" sign, so you have to take it out before you pass your string to int_of_string.
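A minimal sketch of that fix in the lexer action, assuming the same INTEGER constructor as in the question, and with the sign made optional so unsigned literals still match:

| ['+' '-']? ['0'-'9']+ as lxm
    { (* int_of_string rejects a leading '+', so strip it off first;
         a leading '-' converts fine *)
      let s =
        if String.length lxm > 0 && lxm.[0] = '+'
        then String.sub lxm 1 (String.length lxm - 1)
        else lxm
      in
      INTEGER (int_of_string s) }

Note that with ['+' '-'] (no ?) the sign is mandatory, so a plain 42 would no longer match the INTEGER rule at all; keeping the ? avoids that separate problem.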