I'm trying to optimize a model with 30,000 variables and 1,700 constraints, but I get the following error when I add a few more constraints.
n <- lp("max", f.obj, f.con, f.dir, f.rhs)$solution
Error: cannot allocate vector of size 129.9 Mb
I'm working on 32-bit Windows with 2 GB of RAM. What can I do to optimize my model with such a large dataset?
That's a tiny machine by modern standards, and a non-tiny problem. The short answer is that you should run on a machine with a lot more RAM. Note that the problem isn't that R can't allocate 130 MB vectors in general -- it can -- it's that your specific machine has run out of memory.
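If you want to confirm it's an address-space problem rather than something wrong with the model itself, you can check what R thinks its limit is before the call fails. A minimal sketch, assuming Windows R (memory.limit() and memory.size() are Windows-only):

    library(lpSolve)

    # On 32-bit Windows, R's entire address space is capped at roughly 2-3 GB,
    # so a single 130 MB allocation can fail once the constraint matrix and
    # solver workspace have already consumed most of it.
    memory.limit()           # total memory R may use, in MB (Windows only)
    memory.size(max = TRUE)  # peak memory R has obtained so far, in MB

    gc()                     # free anything no longer referenced before solving

    n <- lp("max", f.obj, f.con, f.dir, f.rhs)$solution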
I'd suggest running a 64-bit build of R 3.0 on a machine with 16 GB of RAM and seeing if that helps.
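Once you're on the bigger machine, it's worth double-checking that you really are running a 64-bit build, since a 32-bit R on a 64-bit OS hits the same ceiling. A quick check using only base R:

    # A 64-bit build reports "x86_64" (or similar) and an 8-byte pointer size;
    # a 32-bit build reports "i386" and 4 bytes, and keeps the low memory cap.
    R.version$arch
    .Machine$sizeof.pointer  # 8 on 64-bit R, 4 on 32-bit R
    sessionInfo()            # also prints the platform string, e.g. x86_64-w64-mingw32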
You may also want to look into spinning up a machine in the cloud and using RStudio remotely, which will be a lot cheaper than buying a new computer.