Welcome to issho’s documentation!¶
issho¶
issho is a simple and intuitive wrapper over paramiko for configuring and talking to a remote host. keyring is used to manage secrets locally.
issho is designed such that interacting with a single, heavily used remote machine should be easy, and working with more than one remote machine should be simple.
Free software: MIT license
Documentation: https://issho.readthedocs.io.
Installation¶
Install with pip or conda. IMPORTANT NOTE: for Python 3.5, install via pip only; several issho dependencies have not been updated on conda-forge.
pip install issho
conda install -c conda-forge issho
Features¶
- Simple access to simple commands
- Port forwarding
- Executing commands over ssh
- Transferring files over sftp
- Running a hive query
- Running a spark job
Credits¶
This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template.
The sftp work and (future) testing framework are adapted from Jeff Hinrichs’s excellent pysftp package, and some of the ssh framework is inspired by Colin Wood’s sshed.
Shout out to Spencer Tipping, Neal Fultz, and Factual for helping me learn to write my own tools.
Thanks to Michael Vertuli and Fangshu Lin for helping test.
Installation¶
Stable release¶
issho can be installed from either pip or conda:
$ pip install issho
$ conda install -c conda-forge issho
From sources¶
The sources for issho can be downloaded from the Github repo.
You can either clone the public repository:
$ git clone git://github.com/michaelbilow/issho
Or download the tarball:
$ curl -OL https://github.com/michaelbilow/issho/tarball/master
Once you have a copy of the source, you can install it with:
$ python setup.py install
Setup¶
After installing issho, you will want to do some setup. First, add the machine you want a profile for to your .ssh/config. For example, if you want to add a machine with the alias dev (the default for issho), you would add the following lines to your ssh config:
Host dev
HostName your-host-name.com
Port XXXXX
User your_user
Once this is set up, you can set up passwords and common variables using the following command:
issho config dev
This command will drop you into an interactive prompt where you can enter passwords and configuration variables.
Usage¶
Basic Commands¶
To use issho in a project:
from issho import Issho
The first thing to do:
devbox = Issho('dev')
This will set up a connection to the machine referred to as dev in your .ssh/config. Note that this will only work if Issho has already been configured.
To run a command on your devbox, you can do the following:
devbox.exec('echo "Hello, world!"')
'Hello, world!'
Note that the data is printed, not returned.
You can copy a file to or from your remote using put & get:
output_filename = 'test.txt'
copy_back_filename = 'get_test.txt'
with open(output_filename, 'w') as f:
    f.write('\n'.join(map(str, range(5))))
devbox.put(output_filename)
devbox.exec('cat {}'.format(output_filename))
devbox.get(output_filename, copy_back_filename)
for line in open(copy_back_filename):
    print(line.strip())
Convenience Functions¶
Shell Commands¶
Instead of using devbox.exec(cmd, *args), you can write devbox.cmd(*args):
devbox.touch('my_test.txt')
devbox.ls(' | grep my_test.txt')
devbox.rm('my_test.txt')
Underscores in the function name are converted to spaces:
devbox.seq_5()
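This dynamic dispatch can be built by overriding `__getattr__` (which the changelog notes issho does). The following is a minimal sketch of the pattern, not issho's actual implementation; the `RemoteShell` class is hypothetical, and its `exec` just returns the command string instead of running it over SSH:

```python
class RemoteShell:
    """Sketch: turn unknown attribute lookups into shell commands."""

    def exec(self, cmd, *args):
        # In issho this would run the command over SSH; here we just
        # return the assembled command string for illustration.
        return ' '.join([cmd, *args]).strip()

    def __getattr__(self, name):
        # __getattr__ is called only when normal attribute lookup
        # fails, so real methods like exec() are unaffected.
        cmd = name.replace('_', ' ')  # underscores become spaces
        return lambda *args: self.exec(cmd, *args)

shell = RemoteShell()
print(shell.ls('-la /tmp'))   # → 'ls -la /tmp'
print(shell.seq_5())          # → 'seq 5'
```

Because `__getattr__` only fires for missing attributes, explicitly defined methods always take precedence over the generated commands.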
Hadoop & HDFS¶
Hadoop functions can be accessed using the .hadoop or .hdfs methods. You do not need to prepend the dash to hadoop operations, though they will still work with it:
devbox.hdfs('ls /tmp | grep test')
devbox.hadoop('mkdir -p /tmp/test/')
put and get can also transfer to and from HDFS, if passed a qualified HDFS path or if the hadoop option is passed:
devbox.put('test.txt', '/tmp/my_folder/', hadoop=True)
devbox.get('hdfs:///tmp/myfile')
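The choice between SFTP and HDFS could be driven by a check like the following; this is a hypothetical helper, not issho's actual logic:

```python
def use_hadoop(path, hadoop=False):
    """Decide whether a transfer should go through HDFS: either the
    caller passed hadoop=True, or the path is a qualified HDFS path."""
    return hadoop or path.startswith('hdfs://')

print(use_hadoop('hdfs:///tmp/myfile'))     # True: qualified HDFS path
print(use_hadoop('test.txt'))               # False: plain SFTP transfer
print(use_hadoop('test.txt', hadoop=True))  # True: explicit flag
```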
Hive¶
issho offers several convenience functions, including this one for Hive:
devbox.hive('select * from burgers limit 10;')
devbox.hive('burger_query.sql')
Results from hive queries can be output locally by passing an output_filename:
devbox.hive('select stack(3, "hello", "cruel", "world") as val;', "hello.tsv")
Spark¶
issho can trigger a spark job using spark-submit; you can call it using spark_submit or spark:
devbox.spark(application='test.jar', application_class='com.test.SparkWorkflow'...)
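Under the hood, these keyword arguments are turned into a spark-submit command line. A hypothetical sketch of how such a command might be assembled from the documented parameters (not issho's actual implementation, and covering only a subset of them):

```python
def build_spark_submit(spark_options=None, application_class='',
                       application='', application_args=''):
    """Assemble a spark-submit command string from keyword options."""
    parts = ['spark-submit']
    # Generic options pass through as --key value pairs.
    for opt, value in (spark_options or {}).items():
        parts.append('--{} {}'.format(opt, value))
    # application_class is syntactic sugar for --class.
    if application_class:
        parts.append('--class {}'.format(application_class))
    parts.append(application)
    if application_args:
        parts.append(application_args)
    return ' '.join(parts)

cmd = build_spark_submit(
    spark_options={'master': 'yarn', 'executor-memory': '4g'},
    application_class='com.test.SparkWorkflow',
    application='test.jar',
)
print(cmd)
# spark-submit --master yarn --executor-memory 4g --class com.test.SparkWorkflow test.jar
```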
issho¶
issho package¶
Submodules¶
issho.cli module¶
class issho.cli.IsshoCLI
Bases: object
CLI for Issho; right now only used for configuration.
config(profile, env=None, ssh_profile='', ssh_config='~/.ssh/config', rsa_id='~/.ssh/id_rsa')
Configures a single issho profile. Saves non-private variables to ~/.issho/conf.toml and passwords to the local keyring.
Parameters:
- profile – name of the profile to configure
- env – optional environment variable profile to draw from
- ssh_profile – the name of the associated ssh config profile; defaults to the profile name if not supplied
- ssh_config – the path to the ssh config to be used for this profile
- rsa_id – the path to the id_rsa file to be used for this profile
static env(env_name)
Saves a set of variables to ~/.issho/envs.toml.
Parameters:
- env_name – name of the environment to set up or update
issho.config module¶
issho.config.read_issho_conf(profile, filename='~/.issho/conf.toml')
Reads issho variables from a .toml configuration file.
Parameters:
- profile – the name of the profile to read
- filename – the configuration file to read from
Returns: a dict of the data stored with that profile in the configuration file
issho.config.read_issho_env(profile)
Reads issho environment variables into a dict.
Parameters:
- profile – the name of the issho environment to draw from
Returns: a dict of the data stored with that profile in the environment file
issho.config.read_ssh_profile(ssh_config_path, profile)
Helper method for getting data from .ssh/config.
issho.helpers module¶
issho.helpers.able_to_connect(host, port, timeout=1.5)
Returns True if it is possible to connect to the specified host and port within the given timeout, in seconds.
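A helper with able_to_connect's documented signature can be written with a plain TCP connection attempt; a minimal sketch, which may differ from issho's implementation:

```python
import socket

def able_to_connect(host, port, timeout=1.5):
    """Return True if a TCP connection to host:port succeeds within
    `timeout` seconds, False otherwise."""
    try:
        # create_connection raises OSError on refusal or timeout.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(able_to_connect('localhost', 1))  # False unless something listens on port 1
```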
issho.helpers.absolute_path(raw_path)
Gets the string absolute path from a path object or string.
Parameters:
- raw_path – a string or pathlib.Path object
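A helper with absolute_path's documented behavior can be sketched with pathlib, assuming it expands a leading ~ and resolves relative paths (issho's implementation may differ):

```python
import pathlib

def absolute_path(raw_path):
    """Return the absolute path string for a str or pathlib.Path,
    expanding a leading ~ to the user's home directory."""
    return str(pathlib.Path(raw_path).expanduser().absolute())

print(absolute_path('~/.issho/conf.toml'))  # e.g. /home/you/.issho/conf.toml
```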
issho.issho module¶
Implementation for the Issho class, which implements a connection and some simple commands over ssh, using keyring to manage secrets locally.
class issho.issho.Issho(profile='dev', kinit=True)
Bases: object
exec(cmd, *args, bg=False, debug=False, capture_output=False)
Execute a command in bash over the SSH connection.
Note: this command does not use an interactive terminal; it instead uses a non-interactive login shell. This means (specifically) that your aliased commands will not work and only variables exported in your remote .bashrc will be available.
Parameters:
- cmd – the bash command to be run remotely
- *args – additional arguments to the command cmd
- bg – if True, run in the background
- debug – if True, print some debugging output
- capture_output – if True, return stdout as a string
get(remotepath, localpath=None, hadoop=False)
Gets the file at the remote path and puts it locally.
Parameters:
- remotepath – the path on the remote from which to get
- localpath – defaults to the name of the remote path
- hadoop – if True, download from HDFS
hadoop(command, *args, **kwargs)
Execute a hadoop command.
hive(query, output_filename=None, remove_blank_top_line=True)
Runs a hive query using the parameters set in ~/.issho/conf.toml.
Parameters:
- query – a query string, or the name of a query file to run
- output_filename – the local file to write the results of the hive query to; adding this option will also keep a copy of the results in /tmp
- remove_blank_top_line – Hive output usually has a blank top line; this parameter removes it
local_forward(remote_host, remote_port, local_host='0.0.0.0', local_port=44556)
Forwards a port from a remote through this Issho object. Useful for connecting to remote hosts that can only be accessed from inside a VPC of which your devbox is a part.
put(localpath, remotepath=None, hadoop=False)
Puts the file at the local path onto the remote.
Parameters:
- localpath – the local path of the file to put to the remote
- remotepath – defaults to the name of the local path
- hadoop – if True, upload to HDFS
spark_submit(spark_options=None, master='', jars='', files='', driver_class_path='', application_class='', application='', application_args='')
Submit a spark job.
Parameters:
- spark_options – a dict of spark options
- master – syntactic sugar for the --master spark option
- jars – syntactic sugar for the --jars spark option
- files – syntactic sugar for the --files spark option
- driver_class_path – syntactic sugar for the --driver-class-path spark option
- application_class – syntactic sugar for the --class spark option
- application – the application to submit
- application_args – any arguments to be passed to the spark application
Module contents¶
issho - simple connections to remote machines¶
issho is a Python package providing a simple wrapper over paramiko, with operators for interacting with remote machines.
Main Features¶
Here are a few of the things that issho (should) do well:
- execute commands on a remote box
- transfer files to and from a remote easily
- set up an SSH tunnel through a remote
- run Hive & Spark jobs
TODOs¶
- make it easy to interact with hadoop
- make it easy to configure new services
- make it easy to add plugins to issho
Contributing¶
Contributions are welcome, and they are greatly appreciated! Every little bit helps, and credit will always be given.
You can contribute in many ways:
Types of Contributions¶
Report Bugs¶
Report bugs at https://github.com/michaelbilow/issho/issues.
If you are reporting a bug, please include:
Your operating system name and version.
Any details about your local setup that might be helpful in troubleshooting.
Detailed steps to reproduce the bug.
Fix Bugs¶
Look through the GitHub issues for bugs. Anything tagged with “bug” and “help wanted” is open to whoever wants to implement it.
Implement Features¶
Look through the GitHub issues for features. Anything tagged with “enhancement” and “help wanted” is open to whoever wants to implement it.
Write Documentation¶
issho could always use more documentation, whether as part of the official issho docs, in docstrings, or even on the web in blog posts, articles, and such.
Submit Feedback¶
The best way to send feedback is to file an issue at https://github.com/michaelbilow/issho/issues.
If you are proposing a feature:
Explain in detail how it would work.
Keep the scope as narrow as possible, to make it easier to implement.
Remember that this is a volunteer-driven project, and that contributions are welcome :)
Get Started!¶
Ready to contribute? Here’s how to set up issho for local development.
Fork the issho repo on GitHub, and install the pre-commit hooks.
Clone your fork locally:
$ git clone git@github.com:your_name_here/issho.git
$ pre-commit install
Install your local copy into a virtualenv. Assuming you have virtualenvwrapper installed, this is how you set up your fork for local development:
$ mkvirtualenv issho
$ cd issho/
$ python setup.py develop
Create a branch for local development:
$ git checkout -b name-of-your-bugfix-or-feature
Now you can make your changes locally.
When you’re done making changes, check that your changes pass flake8 and the tests, including testing other Python versions with tox:
$ flake8 issho tests
$ python setup.py test or py.test
$ tox
To get flake8 and tox, just pip install them into your virtualenv.
Commit your changes and push your branch to GitHub:
$ git add .
$ git commit -m "Your detailed description of your changes."
$ git push origin name-of-your-bugfix-or-feature
Submit a pull request through the GitHub website.
Pull Request Guidelines¶
Before you submit a pull request, check that it meets these guidelines:
The pull request should include tests.
If the pull request adds functionality, the docs should be updated. Put your new functionality into a function with a docstring, and add the feature to the list in README.rst.
The pull request should work for Python 3.5, 3.6, 3.7, and for PyPy. Check https://travis-ci.org/michaelbilow/issho/pull_requests and make sure that the tests pass for all supported Python versions.
Deploying¶
A reminder for the maintainers on how to deploy. Make sure all your changes are committed (including an entry in HISTORY.rst). Then run:
$ punch --part patch # possible: major / minor / patch
$ git push
$ git push --tags
Travis will then deploy to PyPI if tests pass.
Credits¶
Development Lead¶
Michael Bilow <michael.k.bilow@gmail.com>
Contributors¶
None yet. Why not be the first?
History¶
0.5.1 (2019-06-24)¶
Add hadoop operators
Allow some simple runtime execution by overriding __getattr__
Add new operators to docs
0.5.0 (2019-06-24)¶
Error release
0.4.2 (2019-06-22)¶
Add spark and spark_submit operators
Upgrade to paramiko>=2.5.0, fixing a bug with recent versions of cryptography
0.3.6 (2019-06-06)¶
Format code using black
Update install to include conda-forge path
0.3.5 (2019-05-23)¶
Delete blank top line from beeline by default.
0.3.4 (2019-05-23)¶
Allow hive to output to a file
Add environment variable profiles with
issho env
Update docs
Allow users to re-use variables that have been set in previous configurations
0.3.3 (2019-05-18)¶
Fix bug related to paramiko v2.4 not liking the Mac version of ssh keys.
Added clear error messages to fix.
0.3.1 (2019-04-11)¶
Fix bug regarding ssh vs local user identity
0.3.0 (2019-04-09)¶
Add more configuration and reduce variables on the Issho object.
Allow prompt_toolkit>=1.0.10 to allow jupyter interoperability.
Set up useful passwords using issho config
0.2.5 (2019-03-25)¶
Clean up hive operator and sftp callback
Note that issho is incompatible with jupyter_console<6.0 and ipython<7.0
0.2.4 (2019-03-25)¶
Fix bug in hive operator
0.2.3 (2019-03-25)¶
Add .readthedocs.yml; docs build now passes.
0.2.2 (2019-03-24)¶
Clean up docs, try to have a passing build
0.2.1 (2019-03-22)¶
Add docstrings for all functions
Add autodocs
Switch out bumpversion for bump2version
0.2.0 (2019-03-22)¶
Add Hive function
Add configuration CLI
Fix Travis config to Python 3.5+
0.1.0 (2019-02-26)¶
First release on PyPI.