Using Stanford CoreNLP

I am trying to get started with Stanford CoreNLP. I used some code from the Internet to understand how the coreference resolution tool works. I tried to run the project in Eclipse, but I keep hitting an out-of-memory exception. I tried to increase the heap size, but it makes no difference. Any idea why this is happening? Is it something specific to my code? Any help with using CoreNLP would be great.

EDIT - code added

import edu.stanford.nlp.dcoref.CorefChain;
import edu.stanford.nlp.dcoref.CorefCoreAnnotations;
import edu.stanford.nlp.pipeline.Annotation;
import edu.stanford.nlp.pipeline.StanfordCoreNLP;

import java.util.Iterator;
import java.util.Map;
import java.util.Properties;

public class testmain {
    public static void main(String[] args) {
        String text = "Viki is a smart boy. He knows a lot of things.";
        Annotation document = new Annotation(text);

        // Build a pipeline with tokenizer, sentence splitter, POS tagger,
        // parser and coreference resolution, then annotate the text.
        Properties props = new Properties();
        props.put("annotators", "tokenize, ssplit, pos, parse, dcoref");
        StanfordCoreNLP pipeline = new StanfordCoreNLP(props);
        pipeline.annotate(document);

        // Print every coreference chain. The map is keyed by Integer,
        // so look chains up with the Integer key itself.
        Map<Integer, CorefChain> graph =
                document.get(CorefCoreAnnotations.CorefChainAnnotation.class);
        Iterator<Integer> itr = graph.keySet().iterator();
        while (itr.hasNext()) {
            Integer key = itr.next();
            String value = graph.get(key).toString();
            System.out.println(key + " " + value);
        }
    }
}
3 answers

I ran into a similar problem when building a small application with Stanford CoreNLP in Eclipse.
Increasing Eclipse's own heap size will not solve your problem.
After some searching, it seems to be the memory given to the ant build tool that has to be increased, but I have no idea how to do that.
So I gave up on Eclipse and used NetBeans instead.

PS: You will still end up with a memory exception under NetBeans' default settings, but it can easily be solved by adjusting the -Xmx setting for each application.
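
A quick way to check which heap the launched program actually received (the application's own heap, not the IDE's) is to print the JVM's maximum heap from inside the program. This is just a sketch; the HeapCheck class name is made up for illustration:

// Minimal sketch (hypothetical HeapCheck class): prints the maximum heap
// available to *this* JVM, i.e. whatever -Xmx the launch actually used.
public class HeapCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
        // If this prints the JVM's default (often only a few hundred MB),
        // the -Xmx setting never reached the application, which explains
        // why increasing the IDE's own heap made no difference.
    }
}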


Fix for Eclipse: you can configure this in your Eclipse preferences as follows:

  • Window -> Preferences (on a Mac it's: Eclipse -> Preferences)
  • Java -> Installed JREs
  • Select the JRE and click Edit.
  • In the Default VM arguments field, enter "-Xmx1024M" (or whatever amount of memory you want to allow; 1024 corresponds to 1 GB). See the note after this list for a suggested value.
  • Click Finish, then OK.
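
Note that 1024M may still not be enough for the parse + dcoref pipeline used here; the CoreNLP documentation recommends giving the English models at least 2 GB of heap (that figure is my addition, not part of the original answer), so something like the following is a safer starting value:

-Xmx2g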

I think you can set the heap size via right-click -> Run As -> Run Configurations, in the VM arguments field. I tested it on a Mac and it works.
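
The same flag goes into that VM arguments box, for example -Xmx2g (again, the exact amount is my suggestion, not part of the original answer). Unlike editing the installed JRE's default VM arguments above, this applies only to the selected run configuration, which is usually what you want for a single memory-hungry program.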
