
How to use REST APIs in Python

Using REST APIs with Python can offer storage administrators increased flexibility in their management tasks, but they need some knowledge of how to code.

Openness is increasing, thanks to the likes of Linux, OpenStack and Ceph. This trend has given rise to open, REST-based APIs. With REST APIs, you can integrate storage management with your own scripts, which makes IT more flexible and programmable. If you're a storage admin who wants to manage resources more openly, you need to know a little about using REST APIs in Python, a common scripting language.

REST APIs provide easy access to countless web-based services. Whether you want to send a tweet, upload a document to Dropbox, create your own cloud server or access publicly available data, REST APIs underpin everything. Using these services from Python is quite easy, especially with a native Python wrapper or software development kit (SDK).

Using REST APIs in Python

REST is a popular architectural style that drives the development of modern web and mobile applications. It is a set of conventions for connecting to a service, usually via HTTPS requests. The methods defined by the HTTP protocol, such as GET, PUT, POST, DELETE and OPTIONS, are invoked by the client to specify an action on an endpoint.

For example, to a client's <GET /v2/sizes> request, the service responds by listing the sizes collection. To limit the number of returned items, REST APIs frequently support filtering and pagination through query parameters, as in <GET /v2/droplets?tag_name=prod-nl> and <GET /v2/sizes?page=1&per_page=10>.

In general, the response is formatted as JSON, though XML is also an option. To get the details of an item in the sizes collection, the client invokes the <GET /v2/sizes/<item_id>> call, where <item_id> is the identifier of a specific item. REST conventions call for the response to contain a complete representation of the item. The client can then request a modification of the item by calling either the PUT or POST method and providing the complete representation of the item's new state. Because the server does not need to keep track of client state, REST services scale very well.
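As a minimal sketch of these conventions, the following snippet uses Python's requests module to fetch one page of a sizes collection. The base URL, endpoint and JSON layout are placeholders, not any specific provider's API.

import requests

# Hypothetical REST endpoint; replace with your provider's base URL
BASE_URL = 'https://api.example.com/v2'

# Request one page of ten items from the sizes collection
resp = requests.get(BASE_URL + '/sizes', params={'page': 1, 'per_page': 10})
resp.raise_for_status()

# The JSON response is converted into native Python data structures;
# the 'sizes' key is an assumption about this hypothetical API's layout
for size in resp.json().get('sizes', []):
    print(size)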

Depending on the type of service, providers will often require an API key to authenticate incoming requests. To receive such a key, register at the provider. You can usually request API keys on the Profile or Settings page.
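Once you have a key, it typically travels with every request, most often in an Authorization header. The sketch below shows the common bearer-token pattern against the same hypothetical endpoint; the header name and scheme vary by provider, so check the API documentation.

import requests

# Placeholder key; obtain a real one from the provider's Profile or Settings page
API_KEY = 'REPLACE_WITH_YOUR_API_KEY'

# Many providers expect the key as a bearer token in the Authorization header
headers = {'Authorization': 'Bearer ' + API_KEY}

resp = requests.get('https://api.example.com/v2/sizes', headers=headers)
resp.raise_for_status()
print(resp.json())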

The following examples show how to use the Python SDK provided by Amazon Web Services (AWS) to access files stored in its Simple Storage Service (S3). This SDK supports many more functions, but the goal of the examples is to provide an uncomplicated demonstration of the concepts.

Example 1: Direct access

In this approach, the client program interacts directly with the API endpoint over HTTPS using Python's requests and json modules. The benefit of communicating directly with REST APIs in Python is that the program does not require any specialized modules. On the other hand, you must understand the data structures returned by the REST API service.

In addition to the API logic, you must also implement authentication and authorization, which may require a significant amount of code. S3 is a good example in this regard: AWS documents and supports access to the service without the SDK, but its authentication and authorization scheme takes significant effort to implement and maintain.
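As an illustration, the sketch below lists the contents of an S3 bucket by calling the REST endpoint directly with the requests module. To keep the example free of request signing, it assumes a publicly readable bucket named myfiles in eu-central-1; for private buckets you would also have to implement AWS request signing. Note that S3 answers with XML, which the client must parse itself.

import requests
import xml.etree.ElementTree as ET

# Assumes a publicly readable bucket; private buckets require signed requests
BUCKET_URL = 'https://myfiles.s3.eu-central-1.amazonaws.com/'

# ListObjectsV2 is exposed as a plain GET with the list-type=2 query parameter
resp = requests.get(BUCKET_URL, params={'list-type': 2})
resp.raise_for_status()

# The client has to know the XML structure of the response
ns = {'s3': 'http://s3.amazonaws.com/doc/2006-03-01/'}
root = ET.fromstring(resp.text)
for obj in root.findall('s3:Contents', ns):
    print(obj.find('s3:Key', ns).text, obj.find('s3:Size', ns).text)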

Example 2: Using a Python wrapper

A more convenient way to consume the services provided by REST APIs in Python is to use an SDK, which takes care of the communication on the HTTP level and transforms the JSON responses into Python objects. Examples of this kind of wrapper are Tweepy, the Twitter API wrapper; Boto3 from AWS; and Apache Libcloud, a generic Python library to access infrastructure-as-a-service providers.
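For instance, a minimal Apache Libcloud sketch to list S3 buckets might look like this; the credentials are placeholders, and the exact driver arguments depend on your Libcloud version and region.

from libcloud.storage.types import Provider
from libcloud.storage.providers import get_driver

# Placeholder credentials; in practice, load these from a secure store
driver_cls = get_driver(Provider.S3)
driver = driver_cls('ACCESS_KEY_ID', 'SECRET_ACCESS_KEY')

# The wrapper turns the HTTP responses into Python container objects
for container in driver.list_containers():
    print(container.name)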

The following example uses the default credential store of the AWS Python SDK, which is the $HOME/.aws directory. For additional security, this directory should only be readable by the application's service account. The AWS S3 service requires two configuration files:

$HOME/.aws/config:

[default]
output = json
region = eu-central-1

And $HOME/.aws/credentials, with content similar to this:

[default]
aws_access_key_id = AKIAJXXXXXXXXXXXXXXX
aws_secret_access_key = XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX

Make sure that the required module is installed by using the command <pip install boto3>.

Line 2 imports the boto3 module. Line 12 connects to the AWS S3 service using the default credentials. Line 15 gets a list of objects in the myfiles S3 bucket; besides the names of the files, the items variable contains additional information, such as the size of each object. And line 20 downloads a file from the bucket into the current directory.

01  import sys
02  import boto3
03
04  if len(sys.argv) > 2:
05      bucket = sys.argv[1]
06      obj = sys.argv[2]
07  else:
08      bucket = 'myfiles'
09      obj = 'forrest.jpg'
10
11  # Session
12  s3 = boto3.client('s3')
13
14  # List content bucket
15  items = s3.list_objects_v2(Bucket=bucket, FetchOwner=True)
16  for f in items.get('Contents'):
17      print(f['Key'], f['Size'])
18
19  # Download a file
20  s3.download_file(Bucket=bucket, Key=obj, Filename=obj)
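Save the program under any name you like, for example s3files.py, and run it with <python s3files.py myfiles forrest.jpg>, or without arguments to fall back to the defaults on lines 8 and 9.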

The output produced by this program is:

 

forrest.jpg 1675801
mountains.jpg 2005500
