Compare commits

32 Commits

| Author | SHA1 | Date |
|---|---|---|
| | 6fe68e3c40 | |
| | 82496dc8e4 | |
| | f74ee668b6 | |
| | 45838e7e98 | |
| | ff002c818a | |
| | 79d6b6f67f | |
| | b8993f7d64 | |
| | bd2e0f0d5c | |
| | 7de5b41340 | |
| | a63e9209fd | |
| | b0eb2e0628 | |
| | 60415f8217 | |
| | 724eac38d8 | |
| | 8fa372de15 | |
| | a1ffe04a30 | |
| | 74b0cf868e | |
| | 50e8e2730b | |
| | b484b453e0 | |
| | 7c2e0ddec7 | |
| | 384aba4654 | |
| | a857dd3042 | |
| | b1b672f66d | |
| | 09d9143a82 | |
| | c1a6b57ac5 | |
| | 6b78b7ccc7 | |
| | f0b1cfcba6 | |
| | 4bcd046016 | |
| | ae09f609c2 | |
| | d1006bbc92 | |
| | d58137e8f9 | |
| | 79c83e34a3 | |
| | 37a098109f | |
.gitignore (vendored, new file, +5)
@@ -0,0 +1,5 @@
+*.pyc
+.*
+*egg-info
+dist
+README.html
.travis.yml (new file, +6)
@@ -0,0 +1,6 @@
+language: python
+python:
+  - "2.7"
+install: "pip install -r requirements.txt"
+# run nosetests - Tests
+script: nosetests
Dockerfile (new file, +3)
@@ -0,0 +1,3 @@
+from python:2.7-onbuild
+
+ENTRYPOINT ["python", "-m", "senpy"]
@@ -1,4 +1,5 @@
 include requirements.txt
+include test-requirements.txt
 include README.md
 include senpy/context.jsonld
-recursive-include *.senpy
+graft senpy/plugins
README.md (19 lines removed)
@@ -1,19 +0,0 @@
-![Logo](img/header.png)
-
-[Senpy](http://senpy.herokuapp.com)
-=========================================
-Example endpoint that yields results compatible with the EUROSENTIMENT format and exposes the NIF API.
-It can be used as a template to adapt existing services to EUROSENTIMENT or to create new services.
-
-[DEMO on Heroku](http://eurosentiment-endpoint.herokuapp.com)
-
-This endpoint serves as bootcampt for any developer wishing to build applications that use the EUROSENTIMENT services.
-
-Acknowledgement
----------------
-EUROSENTIMENT PROJECT
-Grant Agreement no: 296277
-Starting date: 01/09/2012
-Project duration: 24 months
-
-![GSI Logo](img/gsi.png)
-![EUROSENTIMENT Logo](img/eurosentiment_logo.png)
README.rst (new file, +91)
@@ -0,0 +1,91 @@
+.. image:: img/header.png
+   :height: 6em
+   :target: http://demos.gsi.dit.upm.es/senpy
+
+.. image:: https://travis-ci.org/gsi-upm/senpy.svg?branch=master
+   :target: https://travis-ci.org/gsi-upm/senpy
+
+Senpy lets you create sentiment analysis web services easily, fast and using a well known API.
+As a bonus, senpy services use semantic vocabularies (e.g. `NIF <http://persistence.uni-leipzig.org/nlp2rdf/>`_, `Marl <http://www.gsi.dit.upm.es/ontologies/marl>`_, `Onyx <http://www.gsi.dit.upm.es/ontologies/onyx>`_) and formats (turtle, JSON-LD, xml-rdf).
+
+Have you ever wanted to turn your sentiment analysis algorithms into a service?
+With senpy, now you can.
+It provides all the tools so you just have to worry about improving your algorithms:
+
+`See it in action. <http://demos.gsi.dit.upm.es/senpy>`_
+
+Installation
+------------
+The stable version can be installed in three ways.
+
+Through PIP
+***********
+
+.. code:: bash
+
+   pip install --user senpy
+
+
+Alternatively, you can use the development version:
+
+.. code:: bash
+
+   git clone git@github.com:gsi-upm/senpy
+   cd senpy
+   pip install --user .
+
+If you want to install senpy globally, use sudo instead of the ``--user`` flag.
+
+Docker Image
+************
+Build the image or use the pre-built one: ``docker run -ti -p 5000:5000 balkian/senpy --host 0.0.0.0 --default-plugins``.
+
+To add custom plugins, add a volume and tell senpy where to find the plugins: ``docker run -ti -p 5000:5000 -v <PATH OF PLUGINS>:/plugins balkian/senpy --host 0.0.0.0 --default-plugins -f /plugins``
+
+Usage
+-----
+
+However, the easiest and recommended way is to just use the command-line tool to load your plugins and launch the server.
+
+.. code:: bash
+
+   senpy
+
+or, alternatively:
+
+.. code:: bash
+
+   python -m senpy
+
+
+This will create a server with any modules found in the current path.
+For more options, see the `--help` page.
+
+Alternatively, you can use the modules included in senpy to build your own application.
+
+Deploying on Heroku
+-------------------
+Use a free heroku instance to share your service with the world.
+Just use the example Procfile in this repository, or build your own.
+
+
+`DEMO on heroku <http://senpy.herokuapp.com>`_
+
+
+For more information, check out the `documentation <http://senpy.readthedocs.org>`_.
+------------------------------------------------------------------------------------
+
+
+Acknowledgement
+---------------
+This development has been partially funded by the European Union through the MixedEmotions Project (project number H2020 655632), as part of the `RIA ICT 15 Big data and Open Data Innovation and take-up` programme.
+
+
+.. image:: img/me.png
+   :target: http://mixedemotions-project.eu
+   :height: 100px
+   :alt: MixedEmotions Logo
+
+.. image:: img/eu-flag.jpg
+   :height: 100px
+   :target: http://ec.europa.eu/research/participants/portal/desktop/en/opportunities/index.html
app.py (10 lines changed)
@@ -15,9 +15,9 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 """
-Simple Sentiment Analysis server for EUROSENTIMENT
+This is a helper for development. If you want to run Senpy use:
 
-This class shows how to use the nif_server module to create custom services.
+python -m senpy
 """
 from gevent.monkey import patch_all; patch_all()
 import gevent
@@ -32,12 +32,12 @@ logging.basicConfig(level=logging.DEBUG)
 
 app = Flask(__name__)
 mypath = os.path.dirname(os.path.realpath(__file__))
-sp = Senpy(app, os.path.join(mypath, "plugins"))
+sp = Senpy(app, os.path.join(mypath, "plugins"), default_plugins=True)
 sp.activate_all()
 
 if __name__ == '__main__':
     import logging
     logging.basicConfig(level=config.DEBUG)
     app.debug = config.DEBUG
-    http_server = WSGIServer(('', 5000), app)
+    http_server = WSGIServer(('', config.SERVER_PORT), app)
     http_server.serve_forever()
docs/.gitignore (vendored, new file, +1)
@@ -0,0 +1 @@
+_build
docs/Makefile (new file, +177)
@@ -0,0 +1,177 @@
+# Makefile for Sphinx documentation
+#
+
+# You can set these variables from the command line.
+SPHINXOPTS    =
+SPHINXBUILD   = sphinx-build
+PAPER         =
+BUILDDIR      = _build
+
+# User-friendly check for sphinx-build
+ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
+$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
+endif
+
+# Internal variables.
+PAPEROPT_a4     = -D latex_paper_size=a4
+PAPEROPT_letter = -D latex_paper_size=letter
+ALLSPHINXOPTS   = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
+# the i18n builder cannot share the environment and doctrees with the others
+I18NSPHINXOPTS  = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
+
+.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext
+
+help:
+	@echo "Please use \`make <target>' where <target> is one of"
+	@echo "  html        to make standalone HTML files"
+	@echo "  dirhtml     to make HTML files named index.html in directories"
+	@echo "  singlehtml  to make a single large HTML file"
+	@echo "  pickle      to make pickle files"
+	@echo "  json        to make JSON files"
+	@echo "  htmlhelp    to make HTML files and a HTML help project"
+	@echo "  qthelp      to make HTML files and a qthelp project"
+	@echo "  devhelp     to make HTML files and a Devhelp project"
+	@echo "  epub        to make an epub"
+	@echo "  latex       to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
+	@echo "  latexpdf    to make LaTeX files and run them through pdflatex"
+	@echo "  latexpdfja  to make LaTeX files and run them through platex/dvipdfmx"
+	@echo "  text        to make text files"
+	@echo "  man         to make manual pages"
+	@echo "  texinfo     to make Texinfo files"
+	@echo "  info        to make Texinfo files and run them through makeinfo"
+	@echo "  gettext     to make PO message catalogs"
+	@echo "  changes     to make an overview of all changed/added/deprecated items"
+	@echo "  xml         to make Docutils-native XML files"
+	@echo "  pseudoxml   to make pseudoxml-XML files for display purposes"
+	@echo "  linkcheck   to check all external links for integrity"
+	@echo "  doctest     to run all doctests embedded in the documentation (if enabled)"
+
+clean:
+	rm -rf $(BUILDDIR)/*
+
+html:
+	$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
+	@echo
+	@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
+
+dirhtml:
+	$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
+	@echo
+	@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
+
+singlehtml:
+	$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
+	@echo
+	@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
+
+pickle:
+	$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
+	@echo
+	@echo "Build finished; now you can process the pickle files."
+
+json:
+	$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
+	@echo
+	@echo "Build finished; now you can process the JSON files."
+
+htmlhelp:
+	$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
+	@echo
+	@echo "Build finished; now you can run HTML Help Workshop with the" \
+	      ".hhp project file in $(BUILDDIR)/htmlhelp."
+
+qthelp:
+	$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
+	@echo
+	@echo "Build finished; now you can run "qcollectiongenerator" with the" \
+	      ".qhcp project file in $(BUILDDIR)/qthelp, like this:"
+	@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/Senpy.qhcp"
+	@echo "To view the help file:"
+	@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/Senpy.qhc"
+
+devhelp:
+	$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
+	@echo
+	@echo "Build finished."
+	@echo "To view the help file:"
+	@echo "# mkdir -p $$HOME/.local/share/devhelp/Senpy"
+	@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/Senpy"
+	@echo "# devhelp"
+
+epub:
+	$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
+	@echo
+	@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
+
+latex:
+	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
+	@echo
+	@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
+	@echo "Run \`make' in that directory to run these through (pdf)latex" \
+	      "(use \`make latexpdf' here to do that automatically)."
+
+latexpdf:
+	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
+	@echo "Running LaTeX files through pdflatex..."
+	$(MAKE) -C $(BUILDDIR)/latex all-pdf
+	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
+
+latexpdfja:
+	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
+	@echo "Running LaTeX files through platex and dvipdfmx..."
+	$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
+	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
+
+text:
+	$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
+	@echo
+	@echo "Build finished. The text files are in $(BUILDDIR)/text."
+
+man:
+	$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
+	@echo
+	@echo "Build finished. The manual pages are in $(BUILDDIR)/man."
+
+texinfo:
+	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
+	@echo
+	@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
+	@echo "Run \`make' in that directory to run these through makeinfo" \
+	      "(use \`make info' here to do that automatically)."
+
+info:
+	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
+	@echo "Running Texinfo files through makeinfo..."
+	make -C $(BUILDDIR)/texinfo info
+	@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."
+
+gettext:
+	$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
+	@echo
+	@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."
+
+changes:
+	$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
+	@echo
+	@echo "The overview file is in $(BUILDDIR)/changes."
+
+linkcheck:
+	$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
+	@echo
+	@echo "Link check complete; look for any errors in the above output " \
+	      "or in $(BUILDDIR)/linkcheck/output.txt."
+
+doctest:
+	$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
+	@echo "Testing of doctests in the sources finished, look at the " \
+	      "results in $(BUILDDIR)/doctest/output.txt."
+
+xml:
+	$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
+	@echo
+	@echo "Build finished. The XML files are in $(BUILDDIR)/xml."
+
+pseudoxml:
+	$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
+	@echo
+	@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."
docs/api.rst (new file, +208)
@@ -0,0 +1,208 @@
+NIF API
+=======
+.. http:get:: /api
+
+   Basic endpoint for sentiment/emotion analysis.
+
+   **Example request**:
+
+   .. sourcecode:: http
+
+      GET /api?input=I%20love%20GSI HTTP/1.1
+      Host: localhost
+      Accept: application/json, text/javascript
+
+
+   **Example response**:
+
+   .. sourcecode:: http
+
+      HTTP/1.1 200 OK
+      Vary: Accept
+      Content-Type: text/javascript
+
+      {
+          "@context": [
+              "http://127.0.0.1/static/context.jsonld",
+          ],
+          "analysis": [
+              {
+                  "@id": "SentimentAnalysisExample",
+                  "@type": "marl:SentimentAnalysis",
+                  "dc:language": "en",
+                  "marl:maxPolarityValue": 10.0,
+                  "marl:minPolarityValue": 0.0
+              }
+          ],
+          "domain": "wndomains:electronics",
+          "entries": [
+              {
+                  "opinions": [
+                      {
+                          "prov:generatedBy": "SentimentAnalysisExample",
+                          "marl:polarityValue": 7.8,
+                          "marl:hasPolarity": "marl:Positive",
+                          "marl:describesObject": "http://www.gsi.dit.upm.es",
+                      }
+                  ],
+                  "nif:isString": "I love GSI",
+                  "strings": [
+                      {
+                          "nif:anchorOf": "GSI",
+                          "nif:taIdentRef": "http://www.gsi.dit.upm.es"
+                      }
+                  ]
+              }
+          ]
+      }
+
+   :query i input: No default. Depends on informat and intype
+   :query f informat: one of `turtle` (default), `text`, `json-ld`
+   :query t intype: one of `direct` (default), `url`
+   :query o outformat: one of `turtle` (default), `text`, `json-ld`
+   :query p prefix: prefix for the URIs
+   :query algo algorithm: algorithm/plugin to use for the analysis. For a list of options, see :http:get:`/api/plugins`. If not provided, the default plugin will be used (:http:get:`/api/plugins/default`).
+
+   :reqheader Accept: the response content type depends on
+                      :mailheader:`Accept` header
+   :resheader Content-Type: this depends on :mailheader:`Accept`
+                            header of request
+   :statuscode 200: no error
+   :statuscode 404: service not found
+
+.. http:post:: /api
+
+   The same as :http:get:`/api`.
+
+.. http:get:: /api/plugins
+
+   Returns a list of installed plugins.
+   **Example request**:
+
+   .. sourcecode:: http
+
+      GET /api/plugins HTTP/1.1
+      Host: localhost
+      Accept: application/json, text/javascript
+
+
+   **Example response**:
+
+   .. sourcecode:: http
+
+      {
+          "@context": {
+              ...
+          },
+          "sentiment140": {
+              "name": "sentiment140",
+              "is_activated": true,
+              "version": "0.1",
+              "extra_params": {
+                  "@id": "extra_params_sentiment140_0.1",
+                  "language": {
+                      "required": false,
+                      "@id": "lang_sentiment140",
+                      "options": [
+                          "es",
+                          "en",
+                          "auto"
+                      ],
+                      "aliases": [
+                          "language",
+                          "l"
+                      ]
+                  }
+              },
+              "@id": "sentiment140_0.1"
+          },
+          "rand": {
+              "name": "rand",
+              "is_activated": true,
+              "version": "0.1",
+              "extra_params": {
+                  "@id": "extra_params_rand_0.1",
+                  "language": {
+                      "required": false,
+                      "@id": "lang_rand",
+                      "options": [
+                          "es",
+                          "en",
+                          "auto"
+                      ],
+                      "aliases": [
+                          "language",
+                          "l"
+                      ]
+                  }
+              },
+              "@id": "rand_0.1"
+          }
+      }
+
+
+.. http:get:: /api/plugins/<pluginname>
+
+   Returns the information of a specific plugin.
+   **Example request**:
+
+   .. sourcecode:: http
+
+      GET /api/plugins/rand HTTP/1.1
+      Host: localhost
+      Accept: application/json, text/javascript
+
+
+   **Example response**:
+
+   .. sourcecode:: http
+
+      {
+          "@id": "rand_0.1",
+          "extra_params": {
+              "@id": "extra_params_rand_0.1",
+              "language": {
+                  "@id": "lang_rand",
+                  "aliases": [
+                      "language",
+                      "l"
+                  ],
+                  "options": [
+                      "es",
+                      "en",
+                      "auto"
+                  ],
+                  "required": false
+              }
+          },
+          "is_activated": true,
+          "name": "rand",
+          "version": "0.1"
+      }
+
+
+.. http:get:: /api/plugins/default
+
+   Return the information about the default plugin.
+
+.. http:get:: /api/plugins/<pluginname>/{de}activate
+
+   {De}activate a plugin.
+
+   **Example request**:
+
+   .. sourcecode:: http
+
+      GET /api/plugins/rand/deactivate HTTP/1.1
+      Host: localhost
+      Accept: application/json, text/javascript
+
+
+   **Example response**:
+
+   .. sourcecode:: http
+
+      {
+          "@context": {},
+          "message": "Ok"
+      }
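
The request/response pair documented above can also be reproduced from Python; a minimal client sketch using the `requests` library (already listed in requirements.txt), assuming a senpy server is running locally on port 5000 and that the JSON-LD output format is requested explicitly:

.. code:: python

   # Hedged sketch of a client for the /api endpoint described above.
   # The host/port and the "o=json-ld" output choice are assumptions.
   import requests

   resp = requests.get("http://localhost:5000/api",
                       params={"input": "I love GSI", "o": "json-ld"})
   resp.raise_for_status()
   analysis = resp.json()
   # Keys follow the example response shown above.
   print(analysis["entries"][0]["opinions"])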
docs/conf.py (new file, +272)
@@ -0,0 +1,272 @@
+# -*- coding: utf-8 -*-
+#
+# Senpy documentation build configuration file, created by
+# sphinx-quickstart on Tue Feb 24 08:57:32 2015.
+#
+# This file is execfile()d with the current directory set to its
+# containing dir.
+#
+# Note that not all possible configuration values are present in this
+# autogenerated file.
+#
+# All configuration values have a default; values that are commented out
+# serve to show the default.
+
+import sys
+import os
+
+# If extensions (or modules to document with autodoc) are in another directory,
+# add these directories to sys.path here. If the directory is relative to the
+# documentation root, use os.path.abspath to make it absolute, like shown here.
+#sys.path.insert(0, os.path.abspath('.'))
+
+# -- General configuration ------------------------------------------------
+
+on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
+
+# If your documentation needs a minimal Sphinx version, state it here.
+#needs_sphinx = '1.0'
+
+# Add any Sphinx extension module names here, as strings. They can be
+# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
+# ones.
+extensions = [
+    'sphinx.ext.autodoc',
+    'sphinx.ext.doctest',
+    'sphinx.ext.todo',
+    'sphinxcontrib.httpdomain',
+    'sphinx.ext.coverage',
+]
+
+# Add any paths that contain templates here, relative to this directory.
+templates_path = ['_templates']
+
+# The suffix of source filenames.
+source_suffix = '.rst'
+
+# The encoding of source files.
+#source_encoding = 'utf-8-sig'
+
+# The master toctree document.
+master_doc = 'index'
+
+# General information about the project.
+project = u'Senpy'
+copyright = u'2015, J. Fernando Sánchez'
+
+# The version info for the project you're documenting, acts as replacement for
+# |version| and |release|, also used in various other places throughout the
+# built documents.
+#
+# The short X.Y version.
+version = '0.4'
+# The full version, including alpha/beta/rc tags.
+release = '0.4'
+
+# The language for content autogenerated by Sphinx. Refer to documentation
+# for a list of supported languages.
+#language = None
+
+# There are two options for replacing |today|: either, you set today to some
+# non-false value, then it is used:
+#today = ''
+# Else, today_fmt is used as the format for a strftime call.
+#today_fmt = '%B %d, %Y'
+
+# List of patterns, relative to source directory, that match files and
+# directories to ignore when looking for source files.
+exclude_patterns = ['_build']
+
+# The reST default role (used for this markup: `text`) to use for all
+# documents.
+#default_role = None
+
+# If true, '()' will be appended to :func: etc. cross-reference text.
+#add_function_parentheses = True
+
+# If true, the current module name will be prepended to all description
+# unit titles (such as .. function::).
+#add_module_names = True
+
+# If true, sectionauthor and moduleauthor directives will be shown in the
+# output. They are ignored by default.
+#show_authors = False
+
+# The name of the Pygments (syntax highlighting) style to use.
+pygments_style = 'sphinx'
+
+# A list of ignored prefixes for module index sorting.
+#modindex_common_prefix = []
+
+# If true, keep warnings as "system message" paragraphs in the built documents.
+#keep_warnings = False
+
+
+# -- Options for HTML output ----------------------------------------------
+if not on_rtd:  # only import and set the theme if we're building docs locally
+    import sphinx_rtd_theme
+    html_theme = 'sphinx_rtd_theme'
+    html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
+
+else:
+    html_theme = 'default'
+
+# The theme to use for HTML and HTML Help pages. See the documentation for
+# a list of builtin themes.
+
+# Theme options are theme-specific and customize the look and feel of a theme
+# further. For a list of options available for each theme, see the
+# documentation.
+#html_theme_options = {}
+
+# Add any paths that contain custom themes here, relative to this directory.
+#html_theme_path = []
+
+# The name for this set of Sphinx documents. If None, it defaults to
+# "<project> v<release> documentation".
+#html_title = None
+
+# A shorter title for the navigation bar. Default is the same as html_title.
+#html_short_title = None
+
+# The name of an image file (relative to this directory) to place at the top
+# of the sidebar.
+#html_logo = None
+
+# The name of an image file (within the static path) to use as favicon of the
+# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
+# pixels large.
+#html_favicon = None
+
+# Add any paths that contain custom static files (such as style sheets) here,
+# relative to this directory. They are copied after the builtin static files,
+# so a file named "default.css" will overwrite the builtin "default.css".
+html_static_path = ['_static']
+
+# Add any extra paths that contain custom files (such as robots.txt or
+# .htaccess) here, relative to this directory. These files are copied
+# directly to the root of the documentation.
+#html_extra_path = []
+
+# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
+# using the given strftime format.
+#html_last_updated_fmt = '%b %d, %Y'
+
+# If true, SmartyPants will be used to convert quotes and dashes to
+# typographically correct entities.
+#html_use_smartypants = True
+
+# Custom sidebar templates, maps document names to template names.
+#html_sidebars = {}
+
+# Additional templates that should be rendered to pages, maps page names to
+# template names.
+#html_additional_pages = {}
+
+# If false, no module index is generated.
+#html_domain_indices = True
+
+# If false, no index is generated.
+#html_use_index = True
+
+# If true, the index is split into individual pages for each letter.
+#html_split_index = False
+
+# If true, links to the reST sources are added to the pages.
+#html_show_sourcelink = True
+
+# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
+#html_show_sphinx = True
+
+# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
+#html_show_copyright = True
+
+# If true, an OpenSearch description file will be output, and all pages will
+# contain a <link> tag referring to it. The value of this option must be the
+# base URL from which the finished HTML is served.
+#html_use_opensearch = ''
+
+# This is the file name suffix for HTML files (e.g. ".xhtml").
+#html_file_suffix = None
+
+# Output file base name for HTML help builder.
+htmlhelp_basename = 'Senpydoc'
+
+
+# -- Options for LaTeX output ---------------------------------------------
+
+latex_elements = {
+    # The paper size ('letterpaper' or 'a4paper').
+    #'papersize': 'letterpaper',
+
+    # The font size ('10pt', '11pt' or '12pt').
+    #'pointsize': '10pt',
+
+    # Additional stuff for the LaTeX preamble.
+    #'preamble': '',
+}
+
+# Grouping the document tree into LaTeX files. List of tuples
+# (source start file, target name, title,
+#  author, documentclass [howto, manual, or own class]).
+latex_documents = [
+    ('index', 'Senpy.tex', u'Senpy Documentation',
+     u'J. Fernando Sánchez', 'manual'),
+]
+
+# The name of an image file (relative to this directory) to place at the top of
+# the title page.
+#latex_logo = None
+
+# For "manual" documents, if this is true, then toplevel headings are parts,
+# not chapters.
+#latex_use_parts = False
+
+# If true, show page references after internal links.
+#latex_show_pagerefs = False
+
+# If true, show URL addresses after external links.
+#latex_show_urls = False
+
+# Documents to append as an appendix to all manuals.
+#latex_appendices = []
+
+# If false, no module index is generated.
+#latex_domain_indices = True
+
+
+# -- Options for manual page output ---------------------------------------
+
+# One entry per manual page. List of tuples
+# (source start file, name, description, authors, manual section).
+man_pages = [
+    ('index', 'senpy', u'Senpy Documentation',
+     [u'J. Fernando Sánchez'], 1)
+]
+
+# If true, show URL addresses after external links.
+#man_show_urls = False
+
+
+# -- Options for Texinfo output -------------------------------------------
+
+# Grouping the document tree into Texinfo files. List of tuples
+# (source start file, target name, title, author,
+#  dir menu entry, description, category)
+texinfo_documents = [
+    ('index', 'Senpy', u'Senpy Documentation',
+     u'J. Fernando Sánchez', 'Senpy', 'One line description of project.',
+     'Miscellaneous'),
+]
+
+# Documents to append as an appendix to all manuals.
+#texinfo_appendices = []
+
+# If false, no module index is generated.
+#texinfo_domain_indices = True
+
+# How to display URL addresses: 'footnote', 'no', or 'inline'.
+#texinfo_show_urls = 'footnote'
+
+# If true, do not generate a @detailmenu in the "Top" node's menu.
+#texinfo_no_detailmenu = False
docs/index.rst (new file, +16)
@@ -0,0 +1,16 @@
+.. Senpy documentation master file, created by
+   sphinx-quickstart on Tue Feb 24 08:57:32 2015.
+   You can adapt this file completely to your liking, but it should at least
+   contain the root `toctree` directive.
+
+Welcome to Senpy's documentation!
+=================================
+
+Contents:
+
+.. toctree::
+   installation
+   usage
+   api
+   plugins
+   :maxdepth: 2
docs/installation.rst (new file, +27)
@@ -0,0 +1,27 @@
+Installation
+------------
+The stable version can be installed in three ways.
+
+Through PIP
+***********
+
+.. code:: bash
+
+   pip install --user senpy
+
+
+Alternatively, you can use the development version:
+
+.. code:: bash
+
+   git clone git@github.com:gsi-upm/senpy
+   cd senpy
+   pip install --user .
+
+If you want to install senpy globally, use sudo instead of the ``--user`` flag.
+
+Docker Image
+************
+Build the image or use the pre-built one: ``docker run -ti -p 5000:5000 balkian/senpy --host 0.0.0.0 --default-plugins``.
+
+To add custom plugins, add a volume and tell senpy where to find the plugins: ``docker run -ti -p 5000:5000 -v <PATH OF PLUGINS>:/plugins balkian/senpy --host 0.0.0.0 --default-plugins -f /plugins``
docs/plugins.rst (new file, +48)
@@ -0,0 +1,48 @@
+Developing new plugins
+----------------------
+
+Plugins Interface
+=================
+
+The basic methods in a plugin are:
+
+* __init__
+* activate: used to load memory-hungry resources
+* deactivate: used to free up resources
+
+Plugins are loaded asynchronously, so don't worry if the activate method takes too long. The plugin will be marked as activated once it is finished executing the method.
+
+F.A.Q.
+======
+If I'm using a classifier, where should I train it?
+???????????????????????????????????????????????????
+
+Training a classifier can be time time consuming. To avoid running the training unnecessarily, you can use ShelfMixin to store the classifier. For instance:
+
+.. code:: python
+
+   from senpy.plugins import ShelfMixin, SenpyPlugin
+
+   class MyPlugin(ShelfMixin, SenpyPlugin):
+       def train(self):
+           ''' Code to train the classifier
+           '''
+           # Here goes the code
+           # ...
+           return classifier
+
+       def activate(self):
+           if 'classifier' not in self.sh:
+               classifier = self.train()
+               self.sh['classifier'] = classifier
+           self.classifier = self.sh['classifier']
+
+       def deactivate(self):
+           self.close()
+
+You can speficy a 'shelf_file' in your .senpy file. By default the ShelfMixin creates a file based on the plugin name and stores it in that plugin's folder.
+
+Where can I find more code examples?
+????????????????????????????????????
+
+See: `<http://github.com/gsi-upm/senpy-plugins-community>`_.
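
For context, a bare-bones plugin without ShelfMixin might look like the sketch below. Only the __init__/activate/deactivate interface is documented above; the `analyse` method and the `EchoPlugin` name are assumptions inferred from how `plug.analyse(**params)` is called in senpy/extensions.py later in this comparison.

.. code:: python

   # Illustrative sketch only; class name and attributes other than
   # SenpyPlugin are assumptions, not part of the documented interface.
   from senpy.plugins import SenpyPlugin


   class EchoPlugin(SenpyPlugin):
       def activate(self):
           # Load heavy resources here; senpy runs this asynchronously
           # and marks the plugin as activated once it finishes.
           self.ready = True

       def deactivate(self):
           # Free whatever activate() loaded.
           self.ready = False

       def analyse(self, **params):
           # Must return a senpy response whose .analysis list the server
           # can extend (see Senpy.analyse in senpy/extensions.py); the
           # response classes live in senpy/models.py.
           raise NotImplementedError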
docs/requirements.txt (new file, +1)
@@ -0,0 +1 @@
+sphinxcontrib-httpdomain>=1.4
docs/usage.rst (new file, +20)
@@ -0,0 +1,20 @@
+Usage
+-----
+
+The easiest and recommended way is to just use the command-line tool to load your plugins and launch the server.
+
+.. code:: bash
+
+   senpy
+
+Or, alternatively:
+
+.. code:: bash
+
+   python -m senpy
+
+
+This will create a server with any modules found in the current path.
+For more options, see the `--help` page.
+
+Alternatively, you can use the modules included in senpy to build your own application.
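
That last option, embedding senpy in your own application, is what the new app.py in this comparison does; a trimmed sketch condensed from it (paths and the port are the same assumptions app.py makes):

.. code:: python

   # Sketch of embedding Senpy in a custom Flask app, condensed from app.py
   # shown earlier in this comparison.
   from gevent.monkey import patch_all; patch_all()
   import os

   from flask import Flask
   from gevent.wsgi import WSGIServer
   from senpy.extensions import Senpy

   app = Flask(__name__)
   mypath = os.path.dirname(os.path.realpath(__file__))
   sp = Senpy(app, os.path.join(mypath, "plugins"), default_plugins=True)
   sp.activate_all()

   if __name__ == '__main__':
       WSGIServer(('', 5000), app).serve_forever()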
img/eu-flag.jpg (binary, new file, 5.6 KiB)
img/final-logo.svg (new file, 828 lines, 81 KiB)
img/gsi.png (binary, new file, 5.8 KiB)
img/header.png (binary, new file, 208 KiB)
(binary image changed: 8.0 KiB before, 8.0 KiB after)
img/logo.svg (new file, 2728 lines, 180 KiB)
img/logo_grande.png (binary, new file, 42 KiB)
img/me.png (binary, new file, 25 KiB)
@@ -1,6 +1,7 @@
-Flask==0.10.1
-gunicorn==19.0.0
-requests==2.4.1
-GitPython==0.3.2.RC1
-Yapsy>=1.10.423
-gevent>=1.0.1
+Flask>=0.10.1
+gunicorn>=19.0.0
+requests>=2.4.1
+GitPython>=0.3.2.RC1
+gevent>=1.0.1
+PyLD>=0.6.5
+Flask-Testing>=0.4.2
@@ -17,9 +17,3 @@
 """
 Sentiment analysis server in Python
 """
-
-import extensions
-import blueprints
-import plugins
-
-__version__ = "0.2.8"
@@ -1,7 +1,81 @@
+#!/usr/bin/python
+# -*- coding: utf-8 -*-
+# Copyright 2014 J. Fernando Sánchez Rada - Grupo de Sistemas Inteligentes
+# DIT, UPM
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+"""
+Senpy is a modular sentiment analysis server. This script runs an instance of
+the server.
+
+"""
 from flask import Flask
-from extensions import Senpy
-app = Flask(__name__)
-sp = Senpy()
-sp.init_app(app)
-app.debug = True
-app.run()
+from senpy.extensions import Senpy
+from gevent.wsgi import WSGIServer
+from gevent.monkey import patch_all
+import gevent
+import logging
+import os
+import argparse
+
+patch_all(thread=False)
+
+
+def main():
+    parser = argparse.ArgumentParser(description='Run a Senpy server')
+    parser.add_argument('--level',
+                        "-l",
+                        metavar="logging_level",
+                        type=str,
+                        default="INFO",
+                        help='Logging level')
+    parser.add_argument('--debug',
+                        "-d",
+                        action='store_true',
+                        default=False,
+                        help='Run the application in debug mode')
+    parser.add_argument('--default-plugins',
+                        action='store_true',
+                        default=False,
+                        help='Load the default plugins')
+    parser.add_argument('--host',
+                        type=str,
+                        default="127.0.0.1",
+                        help='Use 0.0.0.0 to accept requests from any host.')
+    parser.add_argument('--port',
+                        '-p',
+                        type=int,
+                        default=5000,
+                        help='Port to listen on.')
+    parser.add_argument('--plugins-folder',
+                        '-f',
+                        type=str,
+                        default="plugins",
+                        help='Where to look for plugins.')
+    args = parser.parse_args()
+    logging.basicConfig(level=getattr(logging, args.level))
+    app = Flask(__name__)
+    app.debug = args.debug
+    sp = Senpy(app, args.plugins_folder, default_plugins=args.default_plugins)
+    sp.activate_all()
+    http_server = WSGIServer((args.host, args.port), app)
+    try:
+        print("Server running on port %s:%d. Ctrl+C to quit" % (args.host,
+                                                                 args.port))
+        http_server.serve_forever()
+    except KeyboardInterrupt:
+        http_server.stop()
+        print("Bye!")
+
+if __name__ == '__main__':
+    main()
@@ -17,19 +17,35 @@
 """
 Blueprints for Senpy
 """
+from flask import Blueprint, request, current_app
+from .models import Error, Response, Leaf
+
 import json
 import logging
 
 logger = logging.getLogger(__name__)
 
-from flask import Blueprint, request, jsonify, current_app
 
 nif_blueprint = Blueprint("NIF Sentiment Analysis Server", __name__)
 
 BASIC_PARAMS = {
-    "algorithm": {"aliases": ["algorithm", "a", "algo"],
-                  "required": False,
-                  },
+    "algorithm": {
+        "aliases": ["algorithm", "a", "algo"],
+        "required": False,
+    },
+    "inHeaders": {
+        "aliases": ["inHeaders", "headers"],
+        "required": True,
+        "default": "0"
+    }
+}
+
+LIST_PARAMS = {
+    "params": {
+        "aliases": ["params", "with_params"],
+        "required": False,
+        "default": "0"
+    },
 }
 
 
@@ -44,34 +60,40 @@ def get_params(req, params=BASIC_PARAMS):
     outdict = {}
     wrong_params = {}
     for param, options in params.iteritems():
-        for alias in options["aliases"]:
-            if alias in indict:
-                outdict[param] = indict[alias]
-        if param not in outdict:
-            if options.get("required", False) and "default" not in options:
-                wrong_params[param] = params[param]
-            else:
-                if "default" in options:
-                    outdict[param] = options["default"]
-        else:
-            if "options" in params[param] and outdict[param] not in params[param]["options"]:
-                wrong_params[param] = params[param]
+        if param[0] != "@":  # Exclude json-ld properties
+            logger.debug("Param: %s - Options: %s", param, options)
+            for alias in options["aliases"]:
+                if alias in indict:
+                    outdict[param] = indict[alias]
+            if param not in outdict:
+                if options.get("required", False) and "default" not in options:
+                    wrong_params[param] = params[param]
+                else:
+                    if "default" in options:
+                        outdict[param] = options["default"]
+            else:
+                if "options" in params[param] and \
+                   outdict[param] not in params[param]["options"]:
+                    wrong_params[param] = params[param]
     if wrong_params:
-        message = {"status": "failed",
-                   "message": "Missing or invalid parameters",
-                   "parameters": outdict,
-                   "errors": {param: error for param, error in wrong_params.iteritems()}
-                   }
-        raise ValueError(json.dumps(message))
+        message = Error({"status": 404,
+                         "message": "Missing or invalid parameters",
+                         "parameters": outdict,
+                         "errors": {param: error for param, error in
+                                    wrong_params.iteritems()}
+                         })
+        raise ValueError(message)
     return outdict


 def basic_analysis(params):
-    response = {"@context": ["http://demos.gsi.dit.upm.es/eurosentiment/static/context.jsonld",
-                             {
-                              "@base": "{}#".format(request.url.encode('utf-8'))
-                             }
-                             ],
+    response = {"@context":
+                [("http://demos.gsi.dit.upm.es/"
+                  "eurosentiment/static/context.jsonld"),
+                 {
+                     "@base": "{}#".format(request.url.encode('utf-8'))
+                 }
+                 ],
                 "analysis": [{"@type": "marl:SentimentAnalysis"}],
                 "entries": []
                 }
@@ -88,21 +110,28 @@ def basic_analysis(params):
 @nif_blueprint.route('/', methods=['POST', 'GET'])
 def home():
     try:
-        algo = get_params(request).get("algorithm", None)
+        params = get_params(request)
+        algo = params.get("algorithm", None)
         specific_params = current_app.senpy.parameters(algo)
-        params = get_params(request, specific_params)
+        logger.debug(
+            "Specific params: %s", json.dumps(specific_params, indent=4))
+        params.update(get_params(request, specific_params))
         response = current_app.senpy.analyse(**params)
-        return jsonify(response)
+        in_headers = params["inHeaders"] != "0"
+        return response.flask(in_headers=in_headers)
     except ValueError as ex:
-        return ex.message
-    except Exception as ex:
-        return jsonify(status="400", message=ex.message)
+        return ex.message.flask()
 
 
 @nif_blueprint.route("/default")
 def default():
-    return current_app.senpy.default_plugin
-    #return plugins(action="list", plugin=current_app.senpy.default_algorithm)
+    # return current_app.senpy.default_plugin
+    plug = current_app.senpy.default_plugin
+    if plug:
+        return plugins(action="list", plugin=plug.name)
+    else:
+        error = Error(status=404, message="No plugins found")
+        return error.flask()
 
 
 @nif_blueprint.route('/plugins/', methods=['POST', 'GET'])
@@ -117,18 +146,21 @@ def plugins(plugin=None, action="list"):
     if plugin and not plugs:
         return "Plugin not found", 400
     if action == "list":
-        with_params = request.args.get("params", "") == "1"
+        with_params = get_params(request, LIST_PARAMS)["params"] == "1"
+        in_headers = get_params(request, BASIC_PARAMS)["inHeaders"] != "0"
         if plugin:
-            dic = plugs[plugin].jsonable(with_params)
+            dic = plugs[plugin]
         else:
-            dic = {plug: plugs[plug].jsonable(with_params) for plug in plugs}
-        return jsonify(dic)
+            dic = Response(
+                {plug: plugs[plug].jsonld(with_params) for plug in plugs},
+                frame={})
+        return dic.flask(in_headers=in_headers)
     method = "{}_plugin".format(action)
     if(hasattr(sp, method)):
         getattr(sp, method)(plugin)
-        return "Ok"
+        return Leaf(message="Ok").flask()
     else:
-        return "action '{}' not allowed".format(action), 400
+        return Error("action '{}' not allowed".format(action)).flask()
 
 
 if __name__ == '__main__':
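
A rough, self-contained illustration of the alias resolution that get_params performs above; the input dictionary here is hypothetical and the parameter tables are copied from the new BASIC_PARAMS:

```python
# Any alias listed for a parameter maps onto its canonical name, and a
# declared default fills in when no alias is present in the request.
BASIC_PARAMS = {
    "algorithm": {"aliases": ["algorithm", "a", "algo"], "required": False},
    "inHeaders": {"aliases": ["inHeaders", "headers"], "required": True,
                  "default": "0"},
}

indict = {"algo": "sentiment140", "headers": "1"}  # hypothetical request args
outdict = {}
for param, options in BASIC_PARAMS.items():
    for alias in options["aliases"]:
        if alias in indict:
            outdict[param] = indict[alias]

print(outdict)  # {'algorithm': 'sentiment140', 'inHeaders': '1'}
```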
@@ -36,5 +36,7 @@
     },
     "text": { "@id": "nif:isString" },
     "wnaffect": "http://www.gsi.dit.upm.es/ontologies/wnaffect#",
-    "xsd": "http://www.w3.org/2001/XMLSchema#"
+    "xsd": "http://www.w3.org/2001/XMLSchema#",
+    "senpy": "http://www.gsi.dit.upm.es/ontologies/senpy/ns#",
+    "@vocab": "http://www.gsi.dit.upm.es/ontologies/senpy/ns#"
 }
@@ -1,35 +1,43 @@
|
|||||||
"""
|
"""
|
||||||
"""
|
"""
|
||||||
|
import gevent
|
||||||
|
from gevent import monkey
|
||||||
|
monkey.patch_all()
|
||||||
|
|
||||||
|
from .plugins import SenpyPlugin, SentimentPlugin, EmotionPlugin
|
||||||
|
from .models import Error
|
||||||
|
from .blueprints import nif_blueprint
|
||||||
|
|
||||||
|
from git import Repo, InvalidGitRepositoryError
|
||||||
|
from functools import partial
|
||||||
|
|
||||||
import os
|
import os
|
||||||
import fnmatch
|
import fnmatch
|
||||||
import inspect
|
import inspect
|
||||||
import sys
|
import sys
|
||||||
import imp
|
import imp
|
||||||
import logging
|
import logging
|
||||||
|
import traceback
|
||||||
import gevent
|
import gevent
|
||||||
import json
|
import json
|
||||||
|
|
||||||
logger = logging.getLogger(__name__)
|
logger = logging.getLogger(__name__)
|
||||||
|
|
||||||
from .plugins import SenpyPlugin, SentimentPlugin, EmotionPlugin
|
|
||||||
|
|
||||||
from .blueprints import nif_blueprint
|
|
||||||
from git import Repo, InvalidGitRepositoryError
|
|
||||||
from functools import partial
|
|
||||||
|
|
||||||
|
|
||||||
class Senpy(object):
|
class Senpy(object):
|
||||||
|
|
||||||
""" Default Senpy extension for Flask """
|
""" Default Senpy extension for Flask """
|
||||||
|
|
||||||
def __init__(self, app=None, plugin_folder="plugins"):
|
def __init__(self, app=None, plugin_folder="plugins", default_plugins=False):
|
||||||
self.app = app
|
self.app = app
|
||||||
base_folder = os.path.join(os.path.dirname(__file__), "plugins")
|
|
||||||
|
|
||||||
self._search_folders = set()
|
self._search_folders = set()
|
||||||
self._outdated = True
|
self._outdated = True
|
||||||
|
|
||||||
for folder in (base_folder, plugin_folder):
|
self.add_folder(plugin_folder)
|
||||||
self.add_folder(folder)
|
if default_plugins:
|
||||||
|
base_folder = os.path.join(os.path.dirname(__file__), "plugins")
|
||||||
|
self.add_folder(base_folder)
|
||||||
|
|
||||||
if app is not None:
|
if app is not None:
|
||||||
self.init_app(app)
|
self.init_app(app)
|
||||||
@@ -65,30 +73,45 @@ class Senpy(object):
|
|||||||
if "algorithm" in params:
|
if "algorithm" in params:
|
||||||
algo = params["algorithm"]
|
algo = params["algorithm"]
|
||||||
elif self.plugins:
|
elif self.plugins:
|
||||||
algo = self.default_plugin
|
algo = self.default_plugin and self.default_plugin.name
|
||||||
|
if not algo:
|
||||||
|
return Error(status=404,
|
||||||
|
message=("No plugins found."
|
||||||
|
" Please install one.").format(algo))
|
||||||
if algo in self.plugins:
|
if algo in self.plugins:
|
||||||
if self.plugins[algo].is_activated:
|
if self.plugins[algo].is_activated:
|
||||||
plug = self.plugins[algo]
|
plug = self.plugins[algo]
|
||||||
resp = plug.analyse(**params)
|
resp = plug.analyse(**params)
|
||||||
resp.analysis.append(plug.jsonable())
|
resp.analysis.append(plug)
|
||||||
|
logger.debug("Returning analysis result: {}".format(resp))
|
||||||
return resp
|
return resp
|
||||||
logger.debug("Plugin not activated: {}".format(algo))
|
else:
|
||||||
|
logger.debug("Plugin not activated: {}".format(algo))
|
||||||
|
return Error(status=400,
|
||||||
|
message=("The algorithm '{}'"
|
||||||
|
" is not activated yet").format(algo))
|
||||||
else:
|
else:
|
||||||
logger.debug("The algorithm '{}' is not valid\nValid algorithms: {}".format(algo, self.plugins.keys()))
|
logger.debug(("The algorithm '{}' is not valid\n"
|
||||||
return {"status": 400, "message": "The algorithm '{}' is not valid".format(algo)}
|
"Valid algorithms: {}").format(algo,
|
||||||
|
self.plugins.keys()))
|
||||||
|
+        return Error(status=404,
+                     message="The algorithm '{}' is not valid"
+                     .format(algo))

    @property
    def default_plugin(self):
        candidates = self.filter_plugins(is_activated=True)
        if len(candidates) > 0:
-            candidate = candidates.keys()[0]
+            candidate = candidates.values()[0]
            logger.debug("Default: {}".format(candidate))
            return candidate
        else:
            return None

    def parameters(self, algo):
-        return getattr(self.plugins.get(algo or self.default_plugin), "params", {})
+        return getattr(self.plugins.get(algo) or self.default_plugin,
+                       "params",
+                       {})

    def activate_all(self, sync=False):
        ps = []
@@ -107,7 +130,14 @@ class Senpy(object):

    def activate_plugin(self, plugin_name, sync=False):
        plugin = self.plugins[plugin_name]
-        th = gevent.spawn(plugin.activate)
+        def act():
+            try:
+                plugin.activate()
+            except Exception as ex:
+                logger.error("Error activating plugin {}: {}".format(plugin.name,
+                                                                     ex))
+                logger.error("Trace: {}".format(traceback.format_exc()))
+        th = gevent.spawn(act)
        th.link_value(partial(self._set_active_plugin, plugin_name, True))
        if sync:
            th.join()
@@ -134,20 +164,22 @@ class Senpy(object):
    def _load_plugin(root, filename):
        logger.debug("Loading plugin: {}".format(filename))
        fpath = os.path.join(root, filename)
-        with open(fpath,'r') as f:
+        with open(fpath, 'r') as f:
            info = json.load(f)
        logger.debug("Info: {}".format(info))
        sys.path.append(root)
        module = info["module"]
        name = info["name"]
-        (fp, pathname, desc) = imp.find_module(module, [root,])
+        (fp, pathname, desc) = imp.find_module(module, [root, ])
        try:
            tmp = imp.load_module(module, fp, pathname, desc)
            sys.path.remove(root)
            candidate = None
            for _, obj in inspect.getmembers(tmp):
                if inspect.isclass(obj) and inspect.getmodule(obj) == tmp:
-                    logger.debug("Found plugin class: {}@{}".format(obj, inspect.getmodule(obj)))
+                    logger.debug(("Found plugin class:"
+                                  " {}@{}").format(obj, inspect.getmodule(obj))
+                                 )
                    candidate = obj
                    break
            if not candidate:
@@ -156,11 +188,12 @@ class Senpy(object):
            module = candidate(info=info)
            try:
                repo_path = root
-                module.repo = Repo(repo_path)
+                module._repo = Repo(repo_path)
            except InvalidGitRepositoryError:
-                module.repo = None
+                module._repo = None
        except Exception as ex:
-            logger.debug("Exception importing {}: {}".format(filename, ex))
+            logger.error("Exception importing {}: {}".format(filename, ex))
+            logger.error("Trace: {}".format(traceback.format_exc()))
            return None, None
        return name, module
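A note on the activation change above: wrapping `plugin.activate()` in a local `act()` function means a failing plugin logs the error and traceback instead of dying silently inside the greenlet. A minimal, self-contained sketch of the same pattern (not part of the diff; names are illustrative, and it only assumes `gevent` is installed):

```python
import logging
import traceback

import gevent

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("activation-sketch")


def spawn_safely(task):
    """Run task in a greenlet; log exceptions instead of propagating them."""
    def act():
        try:
            task()
        except Exception as ex:
            logger.error("Error running task: %s", ex)
            logger.error("Trace: %s", traceback.format_exc())
    return gevent.spawn(act)


if __name__ == "__main__":
    th = spawn_safely(lambda: 1 / 0)  # fails on purpose; only a log line results
    th.join()
```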
212	senpy/models.py
@@ -1,19 +1,41 @@
import json
import os
from collections import defaultdict
+from pyld import jsonld
+import logging
+from flask import Response as FlaskResponse


-class Leaf(defaultdict):
+class Leaf(dict):
    _prefix = None
+    _frame = {}
+    _context = {}

-    def __init__(self, context=None, prefix=None, ofclass=list):
-        super(Leaf, self).__init__(ofclass)
-        if context:
+    def __init__(self,
+                 *args,
+                 **kwargs):
+
+        id = kwargs.pop("id", None)
+        context = kwargs.pop("context", self._context)
+        vocab = kwargs.pop("vocab", None)
+        prefix = kwargs.pop("prefix", None)
+        frame = kwargs.pop("frame", None)
+        super(Leaf, self).__init__(*args, **kwargs)
+        if context is not None:
            self.context = context
+        if frame is not None:
+            self._frame = frame
        self._prefix = prefix
+        self.id = id

    def __getattr__(self, key):
-        return super(Leaf, self).__getitem__(self._get_key(key))
+        try:
+            return object.__getattr__(self, key)
+        except AttributeError:
+            try:
+                return super(Leaf, self).__getitem__(self._get_key(key))
+            except KeyError:
+                raise AttributeError()

    def __setattr__(self, key, value):
        try:
@@ -21,20 +43,44 @@ class Leaf(defaultdict):
            object.__setattr__(self, key, value)
        except AttributeError:
            key = self._get_key(key)
-            value = self.get_context(value) if key == "@context" else value
+            if key == "@context":
+                value = self.get_context(value)
+            elif key == "@id":
+                value = self.get_id(value)
            if key[0] == "_":
                object.__setattr__(self, key, value)
            else:
-                super(Leaf, self).__setitem__(key, value)
+                if value is None:
+                    try:
+                        super(Leaf, self).__delitem__(key)
+                    except KeyError:
+                        pass
+                else:
+                    super(Leaf, self).__setitem__(key, value)
+
+    def get_id(self, id):
+        """
+        Get id, dealing with prefixes
+        """
+        # This is not the most elegant solution to change the @id attribute,
+        # but it is the quickest way to have it included in the dictionary
+        # without extra boilerplate.
+        if id and self._prefix and ":" not in id:
+            return "{}{}".format(self._prefix, id)
+        else:
+            return id

    def __delattr__(self, key):
-        return super(Leaf, self).__delitem__(self._get_key(key))
+        if key in self.__dict__:
+            del self.__dict__[key]
+        else:
+            super(Leaf, self).__delitem__(self._get_key(key))

    def _get_key(self, key):
-        if key is "context":
-            return "@context"
-        elif self._prefix:
-            return "{}:{}".format(self._prefix, key)
+        if key[0] == "_":
+            return key
+        elif key in ["context", "id"]:
+            return "@{}".format(key)
        else:
            return key

@@ -54,53 +100,153 @@ class Leaf(defaultdict):
        except IOError:
            return context

+    def compact(self):
+        return jsonld.compact(self, self.get_context(self.context))
+
+    def frame(self, frame=None, options=None):
+        if frame is None:
+            frame = self._frame
+        if options is None:
+            options = {}
+        return jsonld.frame(self, frame, options)
+
+    def jsonld(self, frame=None, options=None,
+               context=None, removeContext=None):
+        if removeContext is None:
+            removeContext = Response._context  # Loop?
+        if frame is None:
+            frame = self._frame
+        if context is None:
+            context = self.context
+        else:
+            context = self.get_context(context)
+        # For some reason, this causes errors with pyld
+        # if options is None:
+        #     options = {"expandContext": context.copy() }
+        js = self
+        if frame:
+            logging.debug("Framing: %s", json.dumps(self, indent=4))
+            logging.debug("Framing with %s", json.dumps(frame, indent=4))
+            js = jsonld.frame(js, frame, options)
+            logging.debug("Result: %s", json.dumps(js, indent=4))
+        logging.debug("Compacting with %s", json.dumps(context, indent=4))
+        js = jsonld.compact(js, context, options)
+        logging.debug("Result: %s", json.dumps(js, indent=4))
+        if removeContext == context:
+            del js["@context"]
+        return js
+
+    def to_JSON(self, removeContext=None):
+        return json.dumps(self.jsonld(removeContext=removeContext),
+                          default=lambda o: o.__dict__,
+                          sort_keys=True, indent=4)
+
+    def flask(self,
+              in_headers=False,
+              url="http://demos.gsi.dit.upm.es/senpy/senpy.jsonld"):
+        """
+        Return the values and error to be used in flask
+        """
+        js = self.jsonld()
+        headers = None
+        if in_headers:
+            ctx = js["@context"]
+            headers = {
+                "Link": ('<%s>;'
+                         'rel="http://www.w3.org/ns/json-ld#context";'
+                         ' type="application/ld+json"' % url)
+            }
+            del js["@context"]
+        return FlaskResponse(json.dumps(js, indent=4),
+                             status=self.get("status", 200),
+                             headers=headers,
+                             mimetype="application/json")


class Response(Leaf):
-    def __init__(self, context=None, *args, **kwargs):
+    _context = Leaf.get_context("{}/context.jsonld".format(
+        os.path.dirname(os.path.realpath(__file__))))
+    _frame = {
+        "@context": _context,
+        "analysis": {
+            "@explicit": True,
+            "maxPolarityValue": {},
+            "minPolarityValue": {},
+            "name": {},
+            "version": {},
+        },
+        "entries": {}
+    }
+
+    def __init__(self, *args, **kwargs):
+        context = kwargs.pop("context", None)
+        frame = kwargs.pop("frame", None)
        if context is None:
-            context = "{}/context.jsonld".format(os.path.dirname(
-                os.path.realpath(__file__)))
-        super(Response, self).__init__(*args, context=context, **kwargs)
-        self["analysis"] = []
-        self["entries"] = []
+            context = self._context
+        self.context = context
+        super(Response, self).__init__(
+            *args, context=context, frame=frame, **kwargs)
+        if self._frame is not None and "entries" in self._frame:
+            self.analysis = []
+            self.entries = []
+
+    def jsonld(self, frame=None, options=None, context=None, removeContext={}):
+        return super(Response, self).jsonld(frame,
+                                            options,
+                                            context,
+                                            removeContext)


class Entry(Leaf):
+    _context = {
+        "@vocab": ("http://persistence.uni-leipzig.org/"
+                   "nlp2rdf/ontologies/nif-core#")
+    }
+
    def __init__(self, text=None, emotion_sets=None, opinions=None, **kwargs):
        super(Entry, self).__init__(**kwargs)
        if text:
            self.text = text
-        if emotion_sets:
-            self.emotionSets = emotion_sets
-        if opinions:
-            self.opinions = opinions
+        self.emotionSets = emotion_sets if emotion_sets else []
+        self.opinions = opinions if opinions else []


class Opinion(Leaf):
-    opinionContext = {"@vocab": "http://www.gsi.dit.upm.es/ontologies/marl/ns#"}
+    _context = {
+        "@vocab": "http://www.gsi.dit.upm.es/ontologies/marl/ns#"
+    }
+
    def __init__(self, polarityValue=None, hasPolarity=None, *args, **kwargs):
-        super(Opinion, self).__init__(context=self.opinionContext,
-                                      *args,
+        super(Opinion, self).__init__(*args,
                                      **kwargs)
        if polarityValue is not None:
-            self.polarityValue = polarityValue
+            self.hasPolarityValue = polarityValue
        if hasPolarity is not None:
            self.hasPolarity = hasPolarity


class EmotionSet(Leaf):
-    emotionContext = {}
+    _context = {}
+
    def __init__(self, emotions=None, *args, **kwargs):
        if not emotions:
            emotions = []
-        super(EmotionSet, self).__init__(context=EmotionSet.emotionContext,
+        super(EmotionSet, self).__init__(context=EmotionSet._context,
                                         *args,
                                         **kwargs)
        self.emotions = emotions or []


class Emotion(Leaf):
-    emotionContext = {}
-    def __init__(self, emotions=None, *args, **kwargs):
-        super(EmotionSet, self).__init__(context=Emotion.emotionContext,
-                                         *args,
-                                         **kwargs)
+    _context = {}
+
+
+class Error(Leaf):
+    # A better pattern would be this:
+    # http://flask.pocoo.org/docs/0.10/patterns/apierrors/
+    _frame = {}
+    _context = {}
+
+    def __init__(self, *args, **kwargs):
+        super(Error, self).__init__(*args, **kwargs)
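The new `Leaf`-based models are plain dicts with attribute access plus JSON-LD helpers (`jsonld()`, `to_JSON()`, `flask()`). A hedged usage sketch, mirroring how the plugins further down build their output (the field names come straight from the diff; the text and values are illustrative):

```python
from senpy.models import Response, Entry, Opinion

response = Response()
entry = Entry(id="Entry0", text="Senpy is easy to use")
opinion = Opinion(id="Opinion0",
                  hasPolarity="marl:Positive",
                  polarityValue=0.8)
entry.opinions.append(opinion)
response.entries.append(entry)

# to_JSON() frames and compacts the response into JSON-LD;
# flask() would wrap the same payload in a Flask response object.
print(response.to_JSON())
```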
197	senpy/plugins.py
@@ -1,55 +1,95 @@
+import inspect
+import os.path
+import shelve
import logging
import ConfigParser
+from .models import Response, Leaf

logger = logging.getLogger(__name__)

-PARAMS = {"input": {"aliases": ["i", "input"],
-                    "required": True,
-                    "help": "Input text"
-                    },
-          "informat": {"aliases": ["f", "informat"],
-                       "required": False,
-                       "default": "text",
-                       "options": ["turtle", "text"],
-                       },
-          "intype": {"aliases": ["intype", "t"],
-                     "required": False,
-                     "default": "direct",
-                     "options": ["direct", "url", "file"],
-                     },
-          "outformat": {"aliases": ["outformat", "o"],
-                        "default": "json-ld",
-                        "required": False,
-                        "options": ["json-ld"],
-                        },
-          "language": {"aliases": ["language", "l"],
-                       "required": False,
-                       "options": ["es", "en"],
-                       },
-          "prefix": {"aliases": ["prefix", "p"],
-                     "required": True,
-                     "default": "",
-                     },
-          "urischeme": {"aliases": ["urischeme", "u"],
-                        "required": False,
-                        "default": "RFC5147String",
-                        "options": "RFC5147String"
-                        },
-          }
+PARAMS = {
+    "input": {
+        "@id": "input",
+        "aliases": ["i", "input"],
+        "required": True,
+        "help": "Input text"
+    },
+    "informat": {
+        "@id": "informat",
+        "aliases": ["f", "informat"],
+        "required": False,
+        "default": "text",
+        "options": ["turtle", "text"],
+    },
+    "intype": {
+        "@id": "intype",
+        "aliases": ["intype", "t"],
+        "required": False,
+        "default": "direct",
+        "options": ["direct", "url", "file"],
+    },
+    "outformat": {
+        "@id": "outformat",
+        "aliases": ["outformat", "o"],
+        "default": "json-ld",
+        "required": False,
+        "options": ["json-ld"],
+    },
+    "language": {
+        "@id": "language",
+        "aliases": ["language", "l"],
+        "required": False,
+    },
+    "prefix": {
+        "@id": "prefix",
+        "aliases": ["prefix", "p"],
+        "required": True,
+        "default": "",
+    },
+    "urischeme": {
+        "@id": "urischeme",
+        "aliases": ["urischeme", "u"],
+        "required": False,
+        "default": "RFC5147String",
+        "options": "RFC5147String"
+    },
+}


-class SenpyPlugin(object):
+class SenpyPlugin(Leaf):
+    _context = Leaf.get_context(Response._context)
+    _frame = {"@context": _context,
+              "name": {},
+              "extra_params": {"@container": "@index"},
+              "@explicit": True,
+              "version": {},
+              "repo": None,
+              "is_activated": {},
+              "params": None,
+              }
+
    def __init__(self, info=None):
        if not info:
-            raise ValueError("You need to provide configuration information for the plugin.")
+            raise ValueError(("You need to provide configuration"
+                              "information for the plugin."))
        logger.debug("Initialising {}".format(info))
+        super(SenpyPlugin, self).__init__()
        self.name = info["name"]
        self.version = info["version"]
+        self.id = "{}_{}".format(self.name, self.version)
        self.params = info.get("params", PARAMS.copy())
+        if "@id" not in self.params:
+            self.params["@id"] = "params_%s" % self.id
        self.extra_params = info.get("extra_params", {})
-        self.params.update(self.extra_params)
+        self.params.update(self.extra_params.copy())
+        if "@id" not in self.extra_params:
+            self.extra_params["@id"] = "extra_params_%s" % self.id
        self.is_activated = False
-        self.info = info
+        self._info = info
+
+    def get_folder(self):
+        return os.path.dirname(inspect.getfile(self.__class__))

    def analyse(self, *args, **kwargs):
        logger.debug("Analysing with: {} {}".format(self.name, self.version))
@@ -61,54 +101,59 @@ class SenpyPlugin(object):
    def deactivate(self):
        pass

+    def jsonld(self, parameters=False, *args, **kwargs):
+        nframe = kwargs.pop("frame", self._frame)
+        if parameters:
+            nframe = nframe.copy()
+            nframe["params"] = {}
+        return super(SenpyPlugin, self).jsonld(frame=nframe, *args, **kwargs)
+
    @property
    def id(self):
        return "{}_{}".format(self.name, self.version)

-    def jsonable(self, parameters=False):
-        resp = {
-            "@id": "{}_{}".format(self.name, self.version),
-            "is_activated": self.is_activated,
-        }
-        if hasattr(self, "repo") and self.repo:
-            resp["repo"] = self.repo.remotes[0].url
-        if parameters:
-            resp["parameters"] = self.params
-        elif self.extra_params:
-            resp["extra_parameters"] = self.extra_params
-        return resp
+    def __del__(self):
+        ''' Destructor, to make sure all the resources are freed '''
+        self.deactivate()


class SentimentPlugin(SenpyPlugin):
-    def __init__(self,
-                 min_polarity_value=0,
-                 max_polarity_value=1,
-                 **kwargs):
-        super(SentimentPlugin, self).__init__(**kwargs)
-        self.minPolarityValue = min_polarity_value
-        self.maxPolarityValue = max_polarity_value
-
-    def jsonable(self, *args, **kwargs):
-        resp = super(SentimentPlugin, self).jsonable(*args, **kwargs)
-        resp["marl:maxPolarityValue"] = self.maxPolarityValue
-        resp["marl:minPolarityValue"] = self.minPolarityValue
-        return resp
+    def __init__(self, info, *args, **kwargs):
+        super(SentimentPlugin, self).__init__(info, *args, **kwargs)
+        self.minPolarityValue = float(info.get("minPolarityValue", 0))
+        self.maxPolarityValue = float(info.get("maxPolarityValue", 1))


class EmotionPlugin(SenpyPlugin):
-    def __init__(self,
-                 min_emotion_value=0,
-                 max_emotion_value=1,
-                 emotion_category=None,
-                 **kwargs):
-        super(EmotionPlugin, self).__init__(**kwargs)
-        self.minEmotionValue = min_emotion_value
-        self.maxEmotionValue = max_emotion_value
-        self.emotionCategory = emotion_category
-
-    def jsonable(self, *args, **kwargs):
-        resp = super(EmotionPlugin, self).jsonable(*args, **kwargs)
-        resp["onyx:minEmotionValue"] = self.minEmotionValue
-        resp["onyx:maxEmotionValue"] = self.maxEmotionValue
-        return resp
+    def __init__(self, info, *args, **kwargs):
+        resp = super(EmotionPlugin, self).__init__(info, *args, **kwargs)
+        self.minEmotionValue = float(info.get("minEmotionValue", 0))
+        self.maxEmotionValue = float(info.get("maxEmotionValue", 0))
+
+
+class ShelfMixin(object):
+
+    @property
+    def sh(self):
+        if not hasattr(self, '_sh') or not self._sh:
+            self._sh = shelve.open(self.shelf_file, writeback=True)
+        return self._sh
+
+    @sh.deleter
+    def sh(self):
+        if os.path.isfile(self.shelf_file):
+            os.remove(self.shelf_file)
+
+    @property
+    def shelf_file(self):
+        if not hasattr(self, '_shelf_file') or not self._shelf_file:
+            if hasattr(self, '_info') and 'shelf_file' in self._info:
+                self._shelf_file = self._info['shelf_file']
+            else:
+                self._shelf_file = os.path.join(self.get_folder(), self.name + '.db')
+        return self._shelf_file
+
+    def close(self):
+        self.sh.close()
+        del(self._sh)
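The new `ShelfMixin` gives a plugin a lazily-opened `shelve` store under `self.sh`, persisted to `shelf_file` (taken from the plugin's info, or defaulting to `<plugin folder>/<name>.db`). A small sketch based on the tests added later in this diff; the plugin name and file path are made up:

```python
from senpy.plugins import SenpyPlugin, ShelfMixin


class CounterPlugin(ShelfMixin, SenpyPlugin):  # hypothetical plugin
    def analyse(self, *args, **kwargs):
        # self.sh behaves like a dict and is persisted between runs
        self.sh['calls'] = self.sh.get('calls', 0) + 1
        return self.sh['calls']


plugin = CounterPlugin(info={'name': 'counter',
                             'version': '0.1',
                             'shelf_file': '/tmp/counter.db'})
print(plugin.analyse())   # 1 on the first run, then 2, 3, ... across restarts
plugin.close()            # flush and close the shelf
```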
31	senpy/plugins/rand/rand.py	Normal file
@@ -0,0 +1,31 @@
import json
import random

from senpy.plugins import SentimentPlugin
from senpy.models import Response, Opinion, Entry


class Sentiment140Plugin(SentimentPlugin):
    def analyse(self, **params):
        lang = params.get("language", "auto")

        p = params.get("prefix", None)
        response = Response(prefix=p)
        polarity_value = max(-1, min(1, random.gauss(0.2, 0.2)))
        polarity = "marl:Neutral"
        if polarity_value > 0:
            polarity = "marl:Positive"
        elif polarity_value < 0:
            polarity = "marl:Negative"
        entry = Entry(id="Entry0",
                      text=params["input"],
                      prefix=p)
        opinion = Opinion(id="Opinion0",
                          prefix=p,
                          hasPolarity=polarity,
                          polarityValue=polarity_value)
        opinion["prov:wasGeneratedBy"] = self.id
        entry.opinions.append(opinion)
        entry.language = lang
        response.entries.append(entry)
        return response
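Normally Senpy's loader instantiates this class from the `rand.senpy` descriptor below (note the class is still named `Sentiment140Plugin` in this commit, apparently carried over from the plugin it was copied from). For a quick manual check it can also be driven by hand; this sketch assumes `rand.py` is on `sys.path`, much like the loader arranges before importing the module:

```python
from rand import Sentiment140Plugin  # assumes rand.py is on sys.path

plugin = Sentiment140Plugin(info={"name": "rand", "version": "0.1"})
result = plugin.analyse(input="senpy is great")
print(result.to_JSON())  # one random marl:Positive/Negative/Neutral opinion
```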
18	senpy/plugins/rand/rand.senpy	Normal file
@@ -0,0 +1,18 @@
{
    "name": "rand",
    "module": "rand",
    "description": "What my plugin broadly does",
    "author": "@balkian",
    "version": "0.1",
    "extra_params": {
        "language": {
            "@id": "lang_rand",
            "aliases": ["language", "l"],
            "required": false,
            "options": ["es", "en", "auto"]
        }
    },
    "requirements": {},
    "marl:maxPolarityValue": "1",
    "marl:minPolarityValue": "-1"
}
@@ -15,18 +15,23 @@ class Sentiment140Plugin(SentimentPlugin):
                             )
                        )

-        response = Response()
-        polarity_value = int(res.json()["data"][0]["polarity"]) * 25
+        p = params.get("prefix", None)
+        response = Response(prefix=p)
+        polarity_value = self.maxPolarityValue*int(res.json()["data"][0]
+                                                   ["polarity"]) * 0.25
        polarity = "marl:Neutral"
-        if polarity_value > 50:
+        neutral_value = self.maxPolarityValue / 2.0
+        if polarity_value > neutral_value:
            polarity = "marl:Positive"
-        elif polarity_value < 50:
+        elif polarity_value < neutral_value:
            polarity = "marl:Negative"
-        entry = Entry(text=params["input"],
-                      prefix=params.get("prefix", ""))
-        opinion = Opinion(hasPolarity=polarity,
-                          polarityValue=polarity_value,
-                          prefix=params.get("prefix", ""))
+        entry = Entry(id="Entry0",
+                      text=params["input"],
+                      prefix=p)
+        opinion = Opinion(id="Opinion0",
+                          prefix=p,
+                          hasPolarity=polarity,
+                          polarityValue=polarity_value)
        opinion["prov:wasGeneratedBy"] = self.id
        entry.opinions.append(opinion)
        entry.language = lang
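The polarity arithmetic changes here: instead of a hard-coded 0-100 scale, the raw Sentiment140 polarity (the upstream API reports 0, 2 or 4) is scaled by `maxPolarityValue * 0.25` and compared against `maxPolarityValue / 2`. A tiny worked example with the values declared in the plugin's `.senpy` file below (`maxPolarityValue` = 1):

```python
max_polarity = 1.0                       # from the plugin's .senpy metadata
neutral_value = max_polarity / 2.0       # 0.5
for raw in (0, 2, 4):                    # raw polarities reported by the API
    value = max_polarity * raw * 0.25    # 0.0, 0.5, 1.0
    if value > neutral_value:
        label = "marl:Positive"
    elif value < neutral_value:
        label = "marl:Negative"
    else:
        label = "marl:Neutral"
    print(raw, value, label)
```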
@@ -6,10 +6,13 @@
    "version": "0.1",
    "extra_params": {
        "language": {
+            "@id": "lang_sentiment140",
            "aliases": ["language", "l"],
            "required": false,
            "options": ["es", "en", "auto"]
        }
    },
-    "requirements": {}
+    "requirements": {},
+    "maxPolarityValue": "1",
+    "minPolarityValue": "0"
}
@@ -1,2 +1,2 @@
[metadata]
-description-file = README.md
+description-file = README.rst
35	setup.py
@@ -1,16 +1,21 @@
+import pip
from setuptools import setup
from pip.req import parse_requirements

# parse_requirements() returns generator of pip.req.InstallRequirement objects
-install_reqs = parse_requirements("requirements.txt")
+try:
+    install_reqs = parse_requirements("requirements.txt", session=pip.download.PipSession())
+    test_reqs = parse_requirements("test-requirements.txt", session=pip.download.PipSession())
+except AttributeError:
+    install_reqs = parse_requirements("requirements.txt")
+    test_reqs = parse_requirements("test-requirements.txt")
+
# reqs is a list of requirement
# e.g. ['django==1.5.1', 'mezzanine==1.4.6']
-reqs = [str(ir.req) for ir in install_reqs]
+install_reqs = [str(ir.req) for ir in install_reqs]
+test_reqs = [str(ir.req) for ir in test_reqs]

-VERSION = "0.3.0"
+VERSION = "0.4.11"

-print(reqs)
-
setup(
    name='senpy',
@@ -22,10 +27,18 @@ extendable, so new algorithms and sources can be used.
    ''',
    author='J. Fernando Sanchez',
    author_email='balkian@gmail.com',
-    url='https://github.com/balkian/senpy',  # use the URL to the github repo
-    download_url='https://github.com/balkian/senpy/archive/{}.tar.gz'.format(VERSION),
-    keywords=['eurosentiment', 'sentiment', 'emotions', 'nif'],  # arbitrary keywords
+    url='https://github.com/gsi-upm/senpy',  # use the URL to the github repo
+    download_url='https://github.com/gsi-upm/senpy/archive/{}.tar.gz'
+    .format(VERSION),
+    keywords=['eurosentiment', 'sentiment', 'emotions', 'nif'],
    classifiers=[],
-    install_requires=reqs,
-    include_package_data = True,
+    install_requires=install_reqs,
+    tests_require=test_reqs,
+    test_suite="nose.collector",
+    include_package_data=True,
+    entry_points={
+        'console_scripts': [
+            'senpy = senpy.__main__:main'
+        ]
+    }
)
3	test-requirements.txt	Normal file
@@ -0,0 +1,3 @@
nose
mock
pbr
@@ -1,4 +1,3 @@
-
import os
import logging

@@ -10,6 +9,7 @@ from senpy.extensions import Senpy
from flask import Flask
from flask.ext.testing import TestCase
from gevent import sleep
+from itertools import product


def check_dict(indic, template):
@@ -17,6 +17,7 @@ def check_dict(indic, template):


class BlueprintsTest(TestCase):
+
    def create_app(self):
        self.app = Flask("test_extensions")
        self.senpy = Senpy()
@@ -27,24 +28,31 @@ class BlueprintsTest(TestCase):
        return self.app

    def test_home(self):
-        """ Calling with no arguments should ask the user for more arguments """
+        """
+        Calling with no arguments should ask the user for more arguments
+        """
        resp = self.client.get("/")
-        self.assert200(resp)
+        self.assert404(resp)
        logging.debug(resp.json)
-        assert resp.json["status"] == "failed"
+        assert resp.json["status"] == 404
        atleast = {
-            "status": "failed",
+            "status": 404,
            "message": "Missing or invalid parameters",
        }
        assert check_dict(resp.json, atleast)

    def test_analysis(self):
-        """ The dummy plugin returns an empty response, it should contain the context """
+        """
+        The dummy plugin returns an empty response,\
+        it should contain the context
+        """
        resp = self.client.get("/?i=My aloha mohame")
        self.assert200(resp)
-        logging.debug(resp.json)
+        logging.debug("Got response: %s", resp.json)
        assert "@context" in resp.json
-        assert check_dict(resp.json["@context"], {"marl": "http://www.gsi.dit.upm.es/ontologies/marl/ns#"})
+        assert check_dict(
+            resp.json["@context"],
+            {"marl": "http://www.gsi.dit.upm.es/ontologies/marl/ns#"})
        assert "entries" in resp.json

    def test_list(self):
@@ -53,6 +61,19 @@ class BlueprintsTest(TestCase):
        self.assert200(resp)
        logging.debug(resp.json)
        assert "Dummy" in resp.json
+        assert "@context" in resp.json
+
+    def test_headers(self):
+        for i, j in product(["/plugins/?nothing=", "/?i=test&"],
+                            ["headers", "inHeaders"]):
+            resp = self.client.get("%s" % (i))
+            assert "@context" in resp.json
+            resp = self.client.get("%s&%s=0" % (i, j))
+            assert "@context" in resp.json
+            resp = self.client.get("%s&%s=1" % (i, j))
+            assert "@context" not in resp.json
+            resp = self.client.get("%s&%s=true" % (i, j))
+            assert "@context" not in resp.json

    def test_detail(self):
        """ Show only one plugin"""
@@ -78,3 +99,16 @@ class BlueprintsTest(TestCase):
        self.assert200(resp)
        assert "is_activated" in resp.json
        assert resp.json["is_activated"] == True
+
+    def test_default(self):
+        """ Show only one plugin"""
+        resp = self.client.get("/default")
+        self.assert200(resp)
+        logging.debug(resp.json)
+        assert "@id" in resp.json
+        assert resp.json["@id"] == "Dummy_0.1"
+        resp = self.client.get("/plugins/Dummy/deactivate")
+        self.assert200(resp)
+        sleep(0.5)
+        resp = self.client.get("/default")
+        self.assert404(resp)
40	tests/context.jsonld	Normal file
@@ -0,0 +1,40 @@
{
    "dc": "http://purl.org/dc/terms/",
    "dc:subject": {
        "@type": "@id"
    },
    "xsd": "http://www.w3.org/2001/XMLSchema#",
    "marl": "http://www.gsi.dit.upm.es/ontologies/marl/ns#",
    "nif": "http://persistence.uni-leipzig.org/nlp2rdf/ontologies/nif-core#",
    "onyx": "http://www.gsi.dit.upm.es/ontologies/onyx/ns#",
    "emotions": {
        "@container": "@set",
        "@id": "onyx:hasEmotionSet"
    },
    "opinions": {
        "@container": "@set",
        "@id": "marl:hasOpinion"
    },
    "prov": "http://www.w3.org/ns/prov#",
    "rdfs": "http://www.w3.org/2000/01/rdf-schema#",
    "analysis": {
        "@container": "@set",
        "@id": "prov:wasInformedBy"
    },
    "entries": {
        "@container": "@set",
        "@id": "prov:generated"
    },
    "strings": {
        "@container": "@set",
        "@reverse": "nif:hasContext"
    },
    "date":
    {
        "@id": "dc:date",
        "@type": "xsd:dateTime"
    },
    "text": { "@id": "nif:isString" },
    "wnaffect": "http://www.gsi.dit.upm.es/ontologies/wnaffect#",
    "xsd": "http://www.w3.org/2001/XMLSchema#"
}
@@ -1,8 +1,8 @@
from senpy.plugins import SentimentPlugin
from senpy.models import Response


class DummyPlugin(SentimentPlugin):
-    pass
+
    def analyse(self, *args, **kwargs):
        return Response()
@@ -11,10 +11,11 @@ from flask.ext.testing import TestCase


class ExtensionsTest(TestCase):
+
    def create_app(self):
        self.app = Flask("test_extensions")
        self.dir = os.path.join(os.path.dirname(__file__), "..")
-        self.senpy = Senpy(plugin_folder=self.dir)
+        self.senpy = Senpy(plugin_folder=self.dir, default_plugins=False)
        self.senpy.init_app(self.app)
        self.senpy.activate_plugin("Dummy", sync=True)
        return self.app
@@ -42,26 +43,39 @@ class ExtensionsTest(TestCase):
    def test_disabling(self):
        """ Disabling a plugin """
        self.senpy.deactivate_all(sync=True)
-        assert self.senpy.plugins["Dummy"].is_activated == False
-        assert self.senpy.plugins["Sleep"].is_activated == False
+        assert not self.senpy.plugins["Dummy"].is_activated
+        assert not self.senpy.plugins["Sleep"].is_activated

    def test_default(self):
        """ Default plugin should be set """
        assert self.senpy.default_plugin
-        assert self.senpy.default_plugin == "Dummy"
+        assert self.senpy.default_plugin.name == "Dummy"
+        self.senpy.deactivate_all(sync=True)
+        logging.debug("Default: {}".format(self.senpy.default_plugin))
+        assert self.senpy.default_plugin is None
+
+    def test_noplugin(self):
+        """ Don't analyse if there isn't any plugin installed """
+        self.senpy.deactivate_all(sync=True)
+        resp = self.senpy.analyse(input="tupni")
+        logging.debug("Response: {}".format(resp))
+        assert resp["status"] == 404

    def test_analyse(self):
        """ Using a plugin """
-        with mock.patch.object(self.senpy.plugins["Dummy"], "analyse") as mocked:
-            self.senpy.analyse(algorithm="Dummy", input="tupni", output="tuptuo")
-            self.senpy.analyse(input="tupni", output="tuptuo")
-            mocked.assert_any_call(input="tupni", output="tuptuo", algorithm="Dummy")
-            mocked.assert_any_call(input="tupni", output="tuptuo")
+        # I was using mock until plugin started inheriting
+        # Leaf (defaultdict with __setattr__ and __getattr__.
+        r1 = self.senpy.analyse(
+            algorithm="Dummy", input="tupni", output="tuptuo")
+        r2 = self.senpy.analyse(input="tupni", output="tuptuo")
+        assert r1.analysis[0].id[:5] == "Dummy"
+        assert r2.analysis[0].id[:5] == "Dummy"
        for plug in self.senpy.plugins:
            self.senpy.deactivate_plugin(plug, sync=True)
        resp = self.senpy.analyse(input="tupni")
        logging.debug("Response: {}".format(resp))
-        assert resp["status"] == 400
+        assert resp["status"] == 404

    def test_filtering(self):
        """ Filtering plugins """
@@ -69,4 +83,5 @@ class ExtensionsTest(TestCase):
        assert not len(self.senpy.filter_plugins(name="notdummy"))
        assert self.senpy.filter_plugins(name="Dummy", is_activated=True)
        self.senpy.deactivate_plugin("Dummy", sync=True)
-        assert not len(self.senpy.filter_plugins(name="Dummy", is_activated=True))
+        assert not len(
+            self.senpy.filter_plugins(name="Dummy", is_activated=True))
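These tests double as a usage reference for the `Senpy` facade itself. A minimal sketch of embedding it in a Flask app outside the test harness (the plugin folder and plugin name are illustrative):

```python
from flask import Flask
from senpy.extensions import Senpy

app = Flask(__name__)
senpy = Senpy(plugin_folder="plugins", default_plugins=False)
senpy.init_app(app)
senpy.activate_plugin("Dummy", sync=True)   # block until the plugin is ready

result = senpy.analyse(algorithm="Dummy", input="some text")
print(result.to_JSON())
```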
81	tests/models_test/__init__.py	Normal file
@@ -0,0 +1,81 @@
import os
import logging

try:
    import unittest.mock as mock
except ImportError:
    import mock
import json
import os
from unittest import TestCase
from senpy.models import Response, Entry
from senpy.plugins import SenpyPlugin


class ModelsTest(TestCase):

    def test_response(self):
        r = Response(context=os.path.normpath(
            os.path.join(__file__, "..", "..", "context.jsonld")))
        assert("@context" in r)
        assert(r._frame)
        logging.debug("Default frame: %s", r._frame)
        assert("marl" in r.context)
        assert("entries" in r.context)

        r2 = Response(context=json.loads('{"test": "roger"}'))
        assert("test" in r2.context)

        r3 = Response(context=None)
        del r3.context
        assert("@context" not in r3)
        assert("entries" in r3)
        assert("analysis" in r3)

        r4 = Response()
        assert("@context" in r4)
        assert("entries" in r4)
        assert("analysis" in r4)

        dummy = SenpyPlugin({"name": "dummy", "version": 0})
        r5 = Response({"dummy": dummy}, context=None, frame=None)
        logging.debug("Response 5: %s", r5)
        assert("dummy" in r5)
        assert(r5["dummy"].name == "dummy")
        js = r5.jsonld(context={}, frame={})
        logging.debug("jsonld 5: %s", js)
        assert("dummy" in js)
        assert(js["dummy"].name == "dummy")

        r6 = Response()
        r6.entries.append(Entry(text="Just testing"))
        logging.debug("Reponse 6: %s", r6)
        assert("@context" in r6)
        assert("marl" in r6.context)
        assert("entries" in r6.context)
        js = r6.jsonld()
        logging.debug("jsonld: %s", js)
        assert("entries" in js)
        assert("entries" in js)
        assert("analysis" in js)
        resp = r6.flask()
        received = json.loads(resp.data)
        logging.debug("Response: %s", js)
        assert(received["entries"])
        assert(received["entries"][0]["text"] == "Just testing")
        assert(received["entries"][0]["text"] != "Not testing")

    def test_opinions(self):
        pass

    def test_plugins(self):
        p = SenpyPlugin({"name": "dummy", "version": 0})
        c = p.jsonld()
        assert "info" not in c
        assert "repo" not in c
        assert "params" not in c
        logging.debug("Framed: %s", c)
        assert "extra_params" in c

    def test_frame_response(self):
        pass
70	tests/plugins_test/__init__.py	Normal file
@@ -0,0 +1,70 @@
#!/bin/env python2
# -*- py-which-shell: "python2"; -*-
import os
import logging
import shelve

try:
    import unittest.mock as mock
except ImportError:
    import mock
import json
import os
from unittest import TestCase
from senpy.models import Response, Entry
from senpy.plugins import SenpyPlugin, ShelfMixin


class ShelfTest(ShelfMixin, SenpyPlugin):

    def test(self, key=None, value=None):
        assert isinstance(self.sh, shelve.Shelf)
        assert key in self.sh
        print('Checking: sh[{}] == {}'.format(key, value))
        print('SH[{}]: {}'.format(key, self.sh[key]))
        assert self.sh[key] == value


class ModelsTest(TestCase):
    shelf_file = 'shelf_test.db'

    def tearDown(self):
        if os.path.isfile(self.shelf_file):
            os.remove(self.shelf_file)

    setUp = tearDown

    def test_shelf(self):
        ''' A shelf is created and the value is stored '''
        a = ShelfTest(info={'name': 'shelve',
                            'version': 'test',
                            'shelf_file': self.shelf_file})
        assert a.sh == {}
        assert a.shelf_file == self.shelf_file

        a.sh['a'] = 'fromA'

        a.test(key='a', value='fromA')
        del(a)
        assert os.path.isfile(self.shelf_file)
        sh = shelve.open(self.shelf_file)
        assert sh['a'] == 'fromA'

    def test_two(self):
        ''' Reusing the values of a previous shelf '''
        a = ShelfTest(info={'name': 'shelve',
                            'version': 'test',
                            'shelf_file': self.shelf_file})
        print('Shelf file: %s' % a.shelf_file)
        a.sh['a'] = 'fromA'
        a.close()

        b = ShelfTest(info={'name': 'shelve',
                            'version': 'test',
                            'shelf_file': self.shelf_file})
        b.test(key='a', value='fromA')
        b.sh['a'] = 'fromB'
        assert b.sh['a'] == 'fromB'
@@ -1,10 +1,17 @@
from senpy.plugins import SenpyPlugin
+from senpy.models import Response
from time import sleep


class SleepPlugin(SenpyPlugin):
+
    def __init__(self, info, *args, **kwargs):
        super(SleepPlugin, self).__init__(info, *args, **kwargs)
        self.timeout = int(info["timeout"])

    def activate(self, *args, **kwargs):
        sleep(self.timeout)
+
+    def analyse(self, *args, **kwargs):
+        sleep(float(kwargs.get("timeout", self.timeout)))
+        return Response()
@@ -4,5 +4,13 @@
    "description": "I am dummy",
    "author": "@balkian",
    "version": "0.1",
-    "timeout": "2"
+    "timeout": "2",
+    "extra_params": {
+        "timeout": {
+            "@id": "timeout_sleep",
+            "aliases": ["timeout", "to"],
+            "required": false,
+            "default": 0
+        }
+    }
}