mirror of https://github.com/gsi-upm/senpy synced 2024-11-24 17:12:29 +00:00

Removed nbsphinx

It requires pandoc, which cannot be installed with pip.

We can either link to the nbfile or convert the file
manually/automatically:

```
jupyter nbconvert SenpyClientUse.ipynb --to rst
```
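
The conversion can also be scripted, which is what "automatically" refers to above. A minimal sketch using nbconvert's Python API (the output filename is just an example; like the CLI call, this still needs pandoc installed locally, which is fine outside the pip-only docs build):

```python
from nbconvert import RSTExporter

# Export the notebook to reStructuredText. Markdown cells go through pandoc,
# so pandoc still has to be installed on the machine doing the conversion.
exporter = RSTExporter()
body, resources = exporter.from_filename('SenpyClientUse.ipynb')

with open('SenpyClientUse.rst', 'w') as f:
    f.write(body)
```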
This commit is contained in:
J. Fernando Sánchez 2017-04-10 20:04:36 +02:00
parent 955e17eb2a
commit 188c33332a
7 changed files with 246 additions and 165 deletions

View File

@@ -2,14 +2,35 @@
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"metadata": {
"ExecuteTime": {
"end_time": "2017-04-10T17:05:31.465571Z",
"start_time": "2017-04-10T19:05:31.458282+02:00"
},
"deletable": true,
"editable": true
},
"source": [
"# Client"
]
},
{
"cell_type": "markdown",
"metadata": {},
"metadata": {
"collapsed": true,
"deletable": true,
"editable": true
},
"source": [
"The built-in senpy client allows you to query any Senpy endpoint. We will illustrate how to use it with the public demo endpoint, and then show you how to spin up your own endpoint using docker."
]
},
{
"cell_type": "markdown",
"metadata": {
"deletable": true,
"editable": true
},
"source": [
"Demo Endpoint\n",
"-------------"
@@ -17,47 +38,88 @@
},
{
"cell_type": "markdown",
"metadata": {},
"metadata": {
"deletable": true,
"editable": true
},
"source": [
"Import Client and send a request"
"To start using senpy, simply create a new Client and point it to your endpoint. In this case, the latest version of Senpy at GSI."
]
},
{
"cell_type": "code",
"execution_count": 7,
"execution_count": null,
"metadata": {
"collapsed": false
"ExecuteTime": {
"end_time": "2017-04-10T17:29:12.827640Z",
"start_time": "2017-04-10T19:29:12.818617+02:00"
},
"collapsed": false,
"deletable": true,
"editable": true
},
"outputs": [],
"source": [
"from senpy.client import Client\n",
"\n",
"c = Client('http://latest.senpy.cluster.gsi.dit.upm.es/api')\n",
"r = c.analyse('I like Pizza', algorithm='sentiment140')"
"c = Client('http://latest.senpy.cluster.gsi.dit.upm.es/api')\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"metadata": {
"deletable": true,
"editable": true
},
"source": [
"Print response"
"Now, let's use that client analyse some queries:"
]
},
{
"cell_type": "code",
"execution_count": 8,
"execution_count": null,
"metadata": {
"collapsed": false
"ExecuteTime": {
"end_time": "2017-04-10T17:29:14.011657Z",
"start_time": "2017-04-10T19:29:13.701808+02:00"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"I like Pizza -> marl:Positive\n"
"collapsed": false,
"deletable": true,
"editable": true
},
"outputs": [],
"source": [
"r = c.analyse('I like sugar!!', algorithm='sentiment140')\n",
"r"
]
}
],
},
{
"cell_type": "markdown",
"metadata": {
"ExecuteTime": {
"end_time": "2017-04-10T17:08:19.616754Z",
"start_time": "2017-04-10T19:08:19.610767+02:00"
},
"deletable": true,
"editable": true
},
"source": [
"As you can see, that gave us the full JSON result. A more concise way to print it would be:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"ExecuteTime": {
"end_time": "2017-04-10T17:29:14.854213Z",
"start_time": "2017-04-10T19:29:14.842068+02:00"
},
"collapsed": false,
"deletable": true,
"editable": true
},
"outputs": [],
"source": [
"for entry in r.entries:\n",
" print('{} -> {}'.format(entry['text'], entry['sentiments'][0]['marl:hasPolarity']))"
@@ -65,36 +127,64 @@
},
{
"cell_type": "markdown",
"metadata": {},
"metadata": {
"deletable": true,
"editable": true
},
"source": [
"Obtain a list of available plugins "
"We can also obtain a list of available plugins with the client:"
]
},
{
"cell_type": "code",
"execution_count": 9,
"execution_count": null,
"metadata": {
"collapsed": false
"ExecuteTime": {
"end_time": "2017-04-10T17:29:16.245198Z",
"start_time": "2017-04-10T19:29:16.056545+02:00"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"emoRand\n",
"rand\n",
"sentiment140\n"
]
}
],
"collapsed": false,
"deletable": true,
"editable": true
},
"outputs": [],
"source": [
"for plugin in c.request('/plugins')['plugins']:\n",
" print(plugin['name'])"
"c.plugins()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"metadata": {
"deletable": true,
"editable": true
},
"source": [
"Or, more concisely:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"ExecuteTime": {
"end_time": "2017-04-10T17:29:17.663275Z",
"start_time": "2017-04-10T19:29:17.484623+02:00"
},
"collapsed": false,
"deletable": true,
"editable": true
},
"outputs": [],
"source": [
"c.plugins().keys()"
]
},
{
"cell_type": "markdown",
"metadata": {
"deletable": true,
"editable": true
},
"source": [
"Local Endpoint\n",
"--------------"
@@ -102,128 +192,81 @@
},
{
"cell_type": "markdown",
"metadata": {},
"metadata": {
"deletable": true,
"editable": true
},
"source": [
"Run a docker container with Senpy image and default plugins"
"To run your own instance of senpy, just create a docker container with the latest Senpy image. Using `--default-plugins` you will get some extra plugins to start playing with the API."
]
},
{
"cell_type": "code",
"execution_count": 12,
"execution_count": null,
"metadata": {
"collapsed": false
"ExecuteTime": {
"end_time": "2017-04-10T17:29:20.637539Z",
"start_time": "2017-04-10T19:29:19.938322+02:00"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"a0157cd98057072388bfebeed78a830da7cf0a796f4f1a3fd9188f9f2e5fe562\r\n"
]
}
],
"source": [
"!docker run -ti --name 'SenpyEndpoint' -d -p 5000:5000 gsiupm/senpy:0.8.6 --host 0.0.0.0 --default-plugins"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Import client and send a request to localhost"
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {
"collapsed": false
"collapsed": false,
"deletable": true,
"editable": true
},
"outputs": [],
"source": [
"c_local = Client('http://127.0.0.1:5000/api')\n",
"r = c_local.analyse('Hello world', algorithm='sentiment140')"
"!docker run -ti --name 'SenpyEndpoint' -d -p 6000:5000 gsiupm/senpy:0.8.6 --host 0.0.0.0 --default-plugins"
]
},
{
"cell_type": "markdown",
"metadata": {},
"metadata": {
"deletable": true,
"editable": true
},
"source": [
"Print response"
"To use this endpoint:"
]
},
{
"cell_type": "code",
"execution_count": 14,
"execution_count": null,
"metadata": {
"collapsed": false
"ExecuteTime": {
"end_time": "2017-04-10T17:29:21.263976Z",
"start_time": "2017-04-10T19:29:21.260595+02:00"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Hello world -> marl:Neutral\n"
]
}
],
"collapsed": false,
"deletable": true,
"editable": true
},
"outputs": [],
"source": [
"for entry in r.entries:\n",
" print('{} -> {}'.format(entry['text'], entry['sentiments'][0]['marl:hasPolarity']))"
"c_local = Client('http://127.0.0.1:6000/api')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"metadata": {
"deletable": true,
"editable": true
},
"source": [
"Obtain a list of available plugins deployed locally"
"That's all! After you are done with your analysis, stop the docker container:"
]
},
{
"cell_type": "code",
"execution_count": 15,
"execution_count": null,
"metadata": {
"collapsed": false
"ExecuteTime": {
"end_time": "2017-04-10T17:29:33.226686Z",
"start_time": "2017-04-10T19:29:22.392121+02:00"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"rand\n",
"sentiment140\n",
"emoRand\n"
]
}
],
"source": [
"for plugin in c_local.request('/plugins')['plugins']:\n",
" print(plugin['name'])"
]
"collapsed": false,
"deletable": true,
"editable": true
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Stop the docker container"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"SenpyEndpoint\n",
"SenpyEndpoint\n"
]
}
],
"outputs": [],
"source": [
"!docker stop SenpyEndpoint\n",
"!docker rm SenpyEndpoint"
@@ -233,7 +276,7 @@
"metadata": {
"anaconda-cloud": {},
"kernelspec": {
"display_name": "Python [default]",
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
@@ -247,7 +290,26 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.2"
"version": "3.6.0"
},
"toc": {
"colors": {
"hover_highlight": "#DAA520",
"running_highlight": "#FF0000",
"selected_highlight": "#FFD700"
},
"moveMenuLeft": true,
"nav_menu": {
"height": "68px",
"width": "252px"
},
"navigate_menu": true,
"number_sections": true,
"sideBar": true,
"threshold": 4,
"toc_cell": false,
"toc_section_display": "block",
"toc_window_display": false
}
},
"nbformat": 4,

View File

@@ -1,10 +0,0 @@
Client demo
===========
This video shows how to use senpy through the command-line tool.
.. image:: https://asciinema.org/a/9uwef1ghkjk062cw2t4mhzpyk.png
:width: 100%
:target: https://asciinema.org/a/9uwef1ghkjk062cw2t4mhzpyk
:alt: CLI demo

View File

@@ -1,24 +1,9 @@
Command line
============
If you want to load modules located in different folders under the root folder, use the following option.
This video shows how to analyse text directly on the command line using the senpy tool.
.. code:: bash
senpy -f .
The default port used by senpy is 5000, but you can change it using the `--port` flag.
.. code:: bash
senpy --port 8080
You can also change the host on which senpy is deployed. The default value is `127.0.0.1`.
.. code:: bash
senpy --host 0.0.0.0
For more options, see the `--help` page.
Alternatively, you can use the modules included in senpy to build your own application.
.. image:: https://asciinema.org/a/9uwef1ghkjk062cw2t4mhzpyk.png
:width: 100%
:target: https://asciinema.org/a/9uwef1ghkjk062cw2t4mhzpyk
:alt: CLI demo

View File

@@ -38,7 +38,6 @@ extensions = [
'sphinxcontrib.httpdomain',
'sphinx.ext.coverage',
'sphinx.ext.autosectionlabel',
'nbsphinx'
]
# Add any paths that contain templates here, relative to this directory.

View File

@@ -1,11 +1,17 @@
Welcome to Senpy's documentation!
=================================
.. image:: https://travis-ci.org/gsi-upm/senpy.svg?branch=master
:target: https://travis-ci.org/gsi-upm/senpy
.. image:: https://readthedocs.org/projects/senpy/badge/?version=latest
:target: http://senpy.readthedocs.io/en/latest/
.. image:: https://badge.fury.io/py/senpy.svg
:target: https://badge.fury.io/py/senpy
.. image:: https://lab.cluster.gsi.dit.upm.es/senpy/senpy/badges/master/build.svg
:target: https://lab.cluster.gsi.dit.upm.es/senpy/senpy/commits/master
.. image:: https://lab.cluster.gsi.dit.upm.es/senpy/senpy/badges/master/coverage.svg
:target: https://lab.cluster.gsi.dit.upm.es/senpy/senpy/commits/master
.. image:: https://img.shields.io/pypi/l/requests.svg
:target: https://lab.cluster.gsi.dit.upm.es/senpy/senpy/
With Senpy, you can easily turn your sentiment or emotion analysis algorithm into a full-blown semantic service.
Sharing your sentiment analysis with the world has never been easier.

View File

@@ -1,8 +1,48 @@
Senpy server
============
Once the server is launched, there is a basic endpoint in the server, which provides a playground to use the plugins that have been loaded.
The senpy server is launched via the `senpy` command:
In case you want to know the different endpoints of the server, there is more information available in the NIF API section_.
::
usage: senpy [-h] [--level logging_level] [--debug] [--default-plugins]
[--host HOST] [--port PORT] [--plugins-folder PLUGINS_FOLDER]
[--only-install]
Run a Senpy server
optional arguments:
-h, --help show this help message and exit
--level logging_level, -l logging_level
Logging level
--debug, -d Run the application in debug mode
--default-plugins Load the default plugins
--host HOST Use 0.0.0.0 to accept requests from any host.
--port PORT, -p PORT Port to listen on.
--plugins-folder PLUGINS_FOLDER, -f PLUGINS_FOLDER
Where to look for plugins.
--only-install, -i Do not run a server, only install plugin dependencies
When launched, the server will recursively look for plugins in the specified plugins folder (the current working directory by default).
The default server includes a playground and an endpoint with all plugins found.
By default, senpy listens only on the `127.0.0.1` address, which means the API can only be accessed from the local machine (localhost).
You can listen on a different address using the `--host` flag.
The default port is 5000, and it can be changed using the `--port` flag.
For instance, to accept connections on port 6000 on any interface:
.. code:: bash
senpy --host 0.0.0.0 --port 6000
For more options, see the `--help` page.
Customizing senpy
=================
Senpy is built on top of Flask, the web framework.
Although it is not the recommended way, you may customize senpy by extending the extensions, blueprints and modules provided in the senpy module.
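A rough illustration of that approach is sketched below. The `Senpy` class lives in `senpy.extensions`, but the constructor arguments and the rest of the snippet are assumptions based on the usual Flask extension pattern, so check the source for the exact API.
.. code:: python

    from flask import Flask
    from senpy.extensions import Senpy

    app = Flask(__name__)

    # Attach senpy to our own Flask app (the plugin folder argument is an assumption).
    sp = Senpy(app, plugin_folder='.')

    # The app can now be customized like any other Flask application
    # (extra blueprints, routes, configuration, ...) before serving it.
    app.run(host='0.0.0.0', port=5000)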

View File

@@ -9,7 +9,6 @@ Once installed, the `senpy` command should be available.
SenpyClientUse
server
clidemo
commandline