Although strictly speaking you can do all the work directly on PiCloud (where I'm handling the dependencies), you'll likely want to get PiCloud, boto, and s3cmd set up locally. See the Day 19 notes and the Day 16 PiCloud intro for a refresher. One big reason for working locally is that you get charged for the time your PiCloud notebook server is running -- and when you are thinking, it's nice to not have to worry about the time (even if it is only $0.05/hour for a running c1 PiCloud instance).
Also, ask for help if you run into problems.
Listing what's in your PiCloud bucket
import cloud
cloud.bucket.list()
[u'notebook/Day_02_class_starter.ipynb', u'notebook/Day_02_completed.ipynb', u'notebook/Day_04_completed.ipynb', u'notebook/Day_04_starter.ipynb', u'notebook/Day_05_plotting.ipynb', u'notebook/Day_07_array_len_and_multiply.ipynb', u'notebook/Day_08_basemap_globe_example.ipynb', u'notebook/Day_08_completed.ipynb', u'notebook/Day_08_freebase_intro.ipynb', u'notebook/Day_08_starter.ipynb', u'notebook/Day_10_A_fixed_width_parsing_completed.ipynb', u'notebook/Day_10_freebase_cursor_completed.ipynb', u'notebook/Day_10_requests_lxml.ipynb', u'notebook/Day_14_PfDA_revisited.ipynb', u'notebook/Day_14_PfDA_starter.ipynb', u'notebook/Day_14_basemap_redux.ipynb', u'notebook/Day_14_date_time.ipynb', u'notebook/Day_15_Sample_Python_Questions.ipynb', u'notebook/Day_16_PiCloud_intro.ipynb', u'notebook/Day_17_Midterm.ipynb', u'notebook/Day_17_Midterm_with_Key.ipynb', u'notebook/Day_18_Common_Crawl.ipynb', u'notebook/Day_19_CC_etc.ipynb', u'notebook/Day_20_CommonCrawl.ipynb', u'notebook/Primer.ipynb', u'notebook/basemap_example.ipynb', u'notebook/notebook_javascript_examples.ipynb', u'notebook/vtk_example.ipynb']
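Since `cloud.bucket.list()` returns a plain list of key strings, you can filter it with ordinary Python. A quick sketch -- the `keys` list below is abbreviated from the output above, so you'd use the real return value in practice:

```python
# cloud.bucket.list() returns a flat list of key names (unicode strings);
# here we use a few keys copied from the listing above
keys = [u'notebook/Day_02_class_starter.ipynb',
        u'notebook/Day_20_CommonCrawl.ipynb',
        u'notebook/Primer.ipynb',
        u'notebook/vtk_example.ipynb']

# keep only the Day 20 notebooks
day_20 = [k for k in keys if k.startswith(u'notebook/Day_20')]
print(day_20)
```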
# http://docs.picloud.com/moduledoc.html#module-cloud.bucket

import os

# only run this if we're not running on PiCloud....
if not os.path.exists('/home/picloud/notebook'):
    pass
    # normally I keep this line commented out to prevent accidental copying if I run the notebook straight through.
    cloud.bucket.put('Day_20_Moving_files_to_PiCloud.ipynb', prefix='notebook')
import os

if not os.path.exists('/home/picloud/notebook'):
    pass
    # normally I keep this line commented out to prevent accidental copying if I run the notebook straight through.
    # note the new local name -- to make it less likely to overwrite something I'm working on locally.
    #cloud.bucket.get('notebook/Day_20_CommonCrawl.ipynb', 'Day_20_CommonCrawl_from_picloud.ipynb')
Warning: I don't think you'll immediately see the notebook changes reflected in an already running PiCloud notebook server -- at least, that was my experience.
There are other ways to interact with PiCloud -- using picloud ssh-info and scp. See SSH into a job and some rough notes. The following code shows how to use picloud ssh-info JID to get the right ssh and scp commands.
You can read off the job id for your PiCloud notebook server from the upper right corner of https://www.picloud.com/accounts/notebook/:
import re

# put the job id of your notebook server after ssh-info
NOTEBOOK_SERVER_RUNNING = False
NOTEBOOK_SERVER_JID = 501

def to_picloud(nb_name):
    scp_to_command = "scp -q -i {identity} -P {port} {nb_name} {username}@{address}:/home/picloud/notebook/".format(nb_name=nb_name, **ssh_info_output)
    return scp_to_command

if NOTEBOOK_SERVER_RUNNING:
    ssh_info_output = !picloud ssh-info $NOTEBOOK_SERVER_JID
    ssh_info_output = dict(zip(*[filter(None, re.split(r"\s+", l)) for l in ssh_info_output]))
    #print ssh_info_output
    ssh_command = "ssh -q -i {identity} {username}@{address} -p {port}".format(**ssh_info_output)
    print ssh_command
    print to_picloud("Day_20_CommonCrawl.ipynb")
    # you can even run the scp command from within IPython Notebook -- uncomment the following lines
    # to_picloud_cmd = to_picloud("Day_20_CommonCrawl.ipynb")
    # ! $to_picloud_cmd
ssh -q -i /Users/raymondyee/.picloud/credentials/6362/id_rsa emp12@ec2-54-234-110-0.compute-1.amazonaws.com -p 21200
scp -q -i /Users/raymondyee/.picloud/credentials/6362/id_rsa -P 21200 Day_20_CommonCrawl.ipynb emp12@ec2-54-234-110-0.compute-1.amazonaws.com:/home/picloud/notebook/
Running scp to the live notebook server machine will actually update the notebooks.
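To make the dict-building line in the code above concrete: picloud ssh-info prints a header row followed by a row of values, and zipping the two split rows pairs each column name with its value. A small sketch with simulated output (the real values come from a running job, so the lines below are stand-ins copied from the output shown earlier):

```python
import re

# simulated output of `picloud ssh-info JID`: a header line and a value line
ssh_info_output = [
    "address                                    identity                                             port   username",
    "ec2-54-234-110-0.compute-1.amazonaws.com   /Users/raymondyee/.picloud/credentials/6362/id_rsa   21200  emp12",
]

# split each line on runs of whitespace, drop empty strings,
# then zip the header tokens with the value tokens
rows = [[tok for tok in re.split(r"\s+", l) if tok] for l in ssh_info_output]
ssh_info = dict(zip(*rows))
print(ssh_info['address'])
```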
I've packaged up a utility to move files using ssh and scp.
from wwod import picloud
<module 'wwod.picloud' from 'wwod/picloud.py'>
# pass in the job id of your running notebook server -- if you have a notebook server running
import cloud
cloud.shortcuts.ssh.get_ssh_info(506)
{'address': 'ec2-107-22-29-58.compute-1.amazonaws.com', 'identity': '/Users/raymondyee/.picloud/credentials/6362/id_rsa', 'port': 20500, 'username': 'emp5'}
import cloud
cloud.shortcuts.ssh.get_ssh_command(506)
'ssh -p 20500 -i /Users/raymondyee/.picloud/credentials/6362/id_rsa -o StrictHostKeyChecking=no -o ConnectTimeout=60 -o LogLevel=QUIET -o UserKnownHostsFile=/dev/null emp5@ec2-107-22-29-58.compute-1.amazonaws.com'
# added scp
from wwod import picloud
picloud.to_picloud_cmd('me', 506)
'scp -q -i /Users/raymondyee/.picloud/credentials/6362/id_rsa -P 20500 me emp5@ec2-107-22-29-58.compute-1.amazonaws.com:/home/picloud/notebook/'
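For reference, the scp command that picloud.to_picloud_cmd emits can be assembled by hand from the dict that cloud.shortcuts.ssh.get_ssh_info returns. A sketch (scp_to_notebook_cmd is a hypothetical helper of my own, and the ssh_info dict just copies the values shown above -- the real implementation in wwod.picloud may differ):

```python
def scp_to_notebook_cmd(filename, ssh_info):
    # format an scp command that copies filename into the notebook
    # directory of the running PiCloud notebook server
    return ("scp -q -i {identity} -P {port} {filename} "
            "{username}@{address}:/home/picloud/notebook/").format(
                filename=filename, **ssh_info)

# values copied from the get_ssh_info(506) output above
ssh_info = {'address': 'ec2-107-22-29-58.compute-1.amazonaws.com',
            'identity': '/Users/raymondyee/.picloud/credentials/6362/id_rsa',
            'port': 20500,
            'username': 'emp5'}

print(scp_to_notebook_cmd('me', ssh_info))
```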