peteanderson80 / coco-caption

Adds SPICE metric to coco-caption evaluation server codes

Home Page: http://panderson.me/spice

License: Other

Languages: Jupyter Notebook 93.34%, Python 6.59%, Shell 0.07%
Topics: mscoco, mscoco-image-dataset, mscoco-dataset, image-captioning, captioning-images, spice

coco-caption's Introduction

Microsoft COCO Caption Evaluation

Evaluation codes for MS COCO caption generation.

No longer maintained: the SPICE metric has been incorporated into the official COCO caption evaluation code, so this repo is no longer updated.

Requirements

  • java 1.8.0
  • python 2.7

Files

./

  • cocoEvalCapDemo.py (demo script)

./annotation

  • captions_val2014.json (MS COCO 2014 caption validation set)
  • Visit MS COCO download page for more details.

./results

  • captions_val2014_fakecap_results.json (an example of fake results for running the demo)
  • Visit MS COCO format page for more details.

./pycocoevalcap: The folder where all evaluation codes are stored.

  • evals.py: includes the COCOEvalCap class that can be used to evaluate results on COCO (see the usage sketch after this list)
  • tokenizer: Python wrapper of Stanford CoreNLP PTBTokenizer
  • bleu: BLEU evaluation codes
  • meteor: Meteor evaluation codes
  • rouge: Rouge-L evaluation codes
  • cider: CIDEr evaluation codes
  • spice: SPICE evaluation codes
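
For orientation, here is a minimal usage sketch modeled on cocoEvalCapDemo.py. The import path of COCOEvalCap and the annotation/result file paths are assumptions that may need adjusting for your checkout or installation.

```python
# Minimal sketch modeled on cocoEvalCapDemo.py; paths and import module
# name are assumptions and may differ in your environment.
from pycocotools.coco import COCO
from pycocoevalcap.eval import COCOEvalCap  # evaluation module listed above

annFile = 'annotation/captions_val2014.json'                # reference captions
resFile = 'results/captions_val2014_fakecap_results.json'   # generated captions

coco = COCO(annFile)              # load ground-truth annotations
cocoRes = coco.loadRes(resFile)   # load candidate results in COCO format

cocoEval = COCOEvalCap(coco, cocoRes)
cocoEval.params['image_id'] = cocoRes.getImgIds()  # score only images with results
cocoEval.evaluate()                                # runs BLEU, METEOR, ROUGE-L, CIDEr, SPICE

for metric, score in cocoEval.eval.items():
    print('%s: %.3f' % (metric, score))
```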

Setup

  • You will first need to download the Stanford CoreNLP 3.6.0 code and models for use by SPICE. To do this, run: ./get_stanford_models.sh

References

Developers

  • Xinlei Chen (CMU)
  • Hao Fang (University of Washington)
  • Tsung-Yi Lin (Cornell)
  • Ramakrishna Vedantam (Virginia Tech)

Acknowledgement

  • David Chiang (University of Notre Dame)
  • Michael Denkowski (CMU)
  • Alexander Rush (Harvard University)

coco-caption's People

Contributors

endernewton, hao-fang, peteanderson80, ramakrishnavedantam928, tylin, vrama91


coco-caption's Issues

How to contribute

Hi,
Thank you very much for the evaluation script. For my application, I needed a Python 3.5 version of this script, so I converted the code to be compatible with Python 3.5. I was wondering whether I could create a pull request to include it either in this repo or somewhere else.

Thanks again

subprocess.CalledProcessError: when calculating SPICE score

Hi, whenever I run spice.compute_score() locally I get this error:

CalledProcessError: Command '['java', '-jar', '-Xmx8G', 'spice-1.0.jar', '/Users/peter/opt/anaconda3/lib/python3.8/site-packages/pycocoevalcap/spice/tmp/tmpbq7g34li', '-cache', '/Users/peter/opt/anaconda3/lib/python3.8/site-packages/pycocoevalcap/spice/cache', '-out', '/Users/peter/opt/anaconda3/lib/python3.8/site-packages/pycocoevalcap/spice/tmp/tmpx6ykjgr7', '-subset', '-silent']' returned non-zero exit status 1.

However, it seems to work on Google Colab. Could you kindly help me out?
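
In case it helps with debugging: the wrapper invokes SPICE with -silent, so one way to surface the underlying Java failure is to re-run the command from the error message by hand with output captured. Common causes include a skipped ./get_stanford_models.sh step or a Java version other than 1.8. The sketch below is hypothetical; all paths are placeholders taken from the error message.

```python
# Hypothetical debugging sketch: re-run the SPICE Java command that failed,
# capturing stderr so the underlying Java error becomes visible.
# All paths below are placeholders for your own environment.
import subprocess

spice_dir = '/path/to/pycocoevalcap/spice'   # placeholder install location
cmd = [
    'java', '-jar', '-Xmx8G', 'spice-1.0.jar',
    spice_dir + '/tmp/your_input_file.json',   # placeholder input file
    '-cache', spice_dir + '/cache',
    '-out', spice_dir + '/tmp/your_output_file.json',
    '-subset',
    # '-silent' deliberately omitted so SPICE prints its own diagnostics
]
proc = subprocess.run(cmd, cwd=spice_dir,
                      stdout=subprocess.PIPE, stderr=subprocess.PIPE)
print(proc.returncode)
print(proc.stderr.decode())
```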

SPICE implementation on Google Colab

Hi!
I am having issues implementing SPICE. Please help.

Error: Could not score batched file input:
java.lang.ClassCastException: org.json.simple.JSONArray cannot be cast to java.lang.String
at edu.anu.spice.SpiceScorer.scoreBatch(SpiceScorer.java:95)
at edu.anu.spice.SpiceScorer.main(SpiceScorer.java:60)
Traceback (most recent call last):
File "/content/pycocoevalcap/spice/model_evaluation.py", line 238, in
Val_with_MLE()
File "/content/pycocoevalcap/spice/model_evaluation.py", line 236, in Val_with_MLE
spice_class.compute_score(all_usable_ref,all_gen_sents)
File "/content/pycocoevalcap/spice/test.py", line 78, in compute_score
cwd=os.path.dirname(os.path.abspath(__file__)))
File "/usr/lib/python3.6/subprocess.py", line 311, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['java', '-jar', '-Xmx8G', 'spice-1.0.jar', '/content/pycocoevalcap/spice/tmp/tmpr49ny2ll', '-cache', '/content/pycocoevalcap/spice/cache', '-out', '/content/pycocoevalcap/spice/tmp/tmpl8edcdti', '-subset', '-silent']' returned non-zero exit status 1.
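
The JSONArray-to-String cast error on the Java side usually indicates that a caption field was passed as a list rather than a plain string. Below is a hedged sketch of the input shape the SPICE wrapper expects; the import path and exact API are assumed from this repo's layout.

```python
# Hedged sketch of the input shape compute_score typically expects: both
# arguments are dicts keyed by image id, and every caption is a plain
# string, not a tokenized list or nested array.
from pycocoevalcap.spice.spice import Spice  # import path is an assumption

gts = {  # reference captions: one or more strings per image id
    'img1': ['a man riding a horse', 'a person on a horse'],
    'img2': ['two dogs playing in the grass'],
}
res = {  # candidate captions: typically exactly one string per image id
    'img1': ['a man rides a horse'],
    'img2': ['dogs play on grass'],
}

scorer = Spice()
average_score, per_image_scores = scorer.compute_score(gts, res)
print('SPICE: %.3f' % average_score)
```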

Problem with SPICE

Hi,

I was trying to report SPICE for our paper; however, we faced multiple problems getting it to work. I followed the provided instructions exactly to incorporate SPICE into our evaluation procedure.
Anyway, I got the following errors:

Error: Could not score batched file input:
org.fusesource.lmdbjni.LMDBException: MDB_CURSOR_FULL: Internal error - cursor stack limit reached
at org.fusesource.lmdbjni.Util.checkErrorCode(Util.java:44)
at org.fusesource.lmdbjni.Database.get(Database.java:202)
at org.fusesource.lmdbjni.Database.get(Database.java:193)
at org.fusesource.lmdbjni.Database.get(Database.java:186)
at org.fusesource.lmdbjni.Database.get(Database.java:161)
at edu.anu.spice.LmdbTupleDB.getTransaction(LmdbTupleDB.java:96)
at edu.anu.spice.SpiceParser.loadTuplesFromDB(SpiceParser.java:195)
at edu.anu.spice.SpiceParser.loadTuples(SpiceParser.java:245)
at edu.anu.spice.SpiceParser.parseCaptions(SpiceParser.java:251)
at edu.anu.spice.SpiceScorer.scoreBatch(SpiceScorer.java:109)
at edu.anu.spice.SpiceScorer.main(SpiceScorer.java:60)

I was wondering whether you have ever faced a problem like the above? (It happens once in a while.) It is worth noting that our system works with the MS COCO evaluation server with no problems. I'd appreciate it if you could let me know your thoughts.

Rasool
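
One workaround occasionally reported for intermittent LMDB cache errors like this is to delete the SPICE tuple cache so it is rebuilt on the next run. A hedged sketch follows; the cache path is an assumption based on this repo's layout.

```python
# Hedged workaround sketch: the stack trace points at the LMDB tuple cache
# (LmdbTupleDB), so clearing it forces SPICE to re-parse captions on the
# next run. The cache location below is an assumption; the wrapper
# normally recreates the directory automatically.
import os
import shutil

cache_dir = 'pycocoevalcap/spice/cache'
if os.path.isdir(cache_dir):
    shutil.rmtree(cache_dir)
```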
