I have recently tried to make my own implementation of NEAT (Neuroevolution of Augmenting Topologies), and it seems to be getting stuck in some sort of local maximum when solving the XOR problem.
My implementation adds hidden nodes and connections as the NEAT algorithm prescribes, and it is capable of solving XOR in principle: on rare occasions it finds a correct solution after hundreds of generations.
All of my code for this project can be found at https://github.com/Maxwell-Hunt/NEAT, and the original paper on this algorithm can be found at http://nn.cs.utexas.edu/downloads/papers/stanley.ec02.pdf.
When I try to make it solve XOR, it usually gets three of the four cases right: [0,0] -> [0], [0,1] -> [1], [1,0] -> [1], but it also outputs [1,1] -> [1], which is incorrect.
I have a NEAT implementation in Ruby that solves the XOR problem nicely:
https://github.com/flajann2/rubyneat
Please feel free to peruse and steal from my code as you like.
Debugging something like this can be tricky. For starters, pay close attention to your fitness formulation. I have not had a chance to look at your code, but your fitness function should award partial credit for partially correct answers; you can see how I do that in my example code:
https://github.com/flajann2/rubyneat_examples
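To illustrate the partial-credit idea, here is a minimal sketch of an XOR fitness function (not taken from either repository; the `activate` method name is an assumption for whatever your network object provides). Rather than counting only exact answers, it scores each case by how close the output is to the target, so near-misses still earn fitness and evolution has a gradient to climb:

```ruby
# The four XOR cases as [inputs, target] pairs.
XOR_CASES = [
  [[0, 0], 0],
  [[0, 1], 1],
  [[1, 0], 1],
  [[1, 1], 0]
].freeze

# `net` is assumed to respond to `activate(inputs) -> Float`;
# adapt the call to whatever your network class actually exposes.
def xor_fitness(net)
  # Sum of absolute errors across all four cases (0.0 for a perfect net).
  error = XOR_CASES.sum do |inputs, target|
    (target - net.activate(inputs)).abs
  end
  # Squash into (0, 16]: a perfect network scores 16, and every
  # reduction in error is rewarded, not just exact answers.
  (4.0 - error)**2
end
```

A fitness like this is what lets a genome that gets [1,1] slightly less wrong than its peers out-reproduce them, even before any genome answers all four cases exactly.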
You may also want to pay close attention to how your sigmoid function works; its steepness affects how easily outputs can settle near 0 and 1.
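For reference, the original NEAT paper (Stanley &amp; Miikkulainen 2002) uses a steepened sigmoid with slope constant 4.9, chosen so activations saturate close to 0 and 1 within roughly [-1, 1] while staying smooth near zero. A one-liner like this makes the steepness an explicit parameter you can experiment with:

```ruby
# Sigmoid with an explicit slope parameter; 4.9 is the constant
# used in the original NEAT paper's modified activation function.
def neat_sigmoid(x, slope = 4.9)
  1.0 / (1.0 + Math.exp(-slope * x))
end
```

If your own sigmoid's slope is very different, the same genomes can behave quite differently, which is worth ruling out while debugging.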
Also at issue may be your selection and mating algorithms. If you do not get them right, evolution may not make progress at all.
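As a sanity check on selection, it can help to isolate it into a small, testable function. This is a tournament-selection sketch, one common choice rather than NEAT's exact scheme (NEAT proper selects within species using fitness sharing); `population` here is assumed to be an array of [genome, fitness] pairs:

```ruby
# Pick k random individuals and return the fittest of them.
# `population` is an array of [genome, fitness] pairs (illustrative
# layout, not any particular library's representation).
def tournament_select(population, k = 3, rng = Random.new)
  population.sample(k, random: rng).max_by { |_genome, fitness| fitness }
end
```

If a simple check like "fitter individuals are selected more often" fails on your selection routine in isolation, that alone can explain a population that never improves.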
My code takes about 100-200 runs to converge on the correct solution for XOR. It should be a lot faster, and I know what I need to do to improve it, but it does work.
Hopefully, this helps.