Fix #77 - Initialized cloud dns. #78

Closed · wants to merge 21 commits
.travis.yml (8 additions, 0 deletions)
@@ -0,0 +1,8 @@
language: python
python:
- "2.6"
- "2.7"
# command to install dependencies
install: "pip install . unittest2"
# command to run tests
script: nosetests
README.rst (44 additions, 16 deletions)
@@ -1,28 +1,36 @@
Google Cloud
============
Google Cloud Python Client
==========================

Official documentation
----------------------
The goal of this project is to make it really simple and Pythonic
to use Google Cloud Platform services.

If you just want to **use** the library
(not contribute to it),
check out the official documentation:
http://GoogleCloudPlatform.github.io/gcloud-python/
.. image:: https://travis-ci.org/GoogleCloudPlatform/gcloud-python.svg?branch=master
:target: https://travis-ci.org/GoogleCloudPlatform/gcloud-python

Incredibly quick demo
---------------------
Quickstart
----------

Start by cloning the repository::
The library is ``pip``-installable::

$ git clone git://github.com/GoogleCloudPlatform/gcloud-python.git
$ cd gcloud
$ python setup.py develop
$ pip install gcloud
$ python -m gcloud.storage.demo # Runs the storage demo!

Documentation
-------------

- `gcloud docs (browse all services, quick-starts, walk-throughs) <http://GoogleCloudPlatform.github.io/gcloud-python/>`_
- `gcloud.datastore API docs <http://googlecloudplatform.github.io/gcloud-python/datastore-api.html>`_
- `gcloud.storage API docs <http://googlecloudplatform.github.io/gcloud-python/storage-api.html>`_
- gcloud.bigquery API docs *(coming soon)*
- gcloud.compute API docs *(coming soon)*
- gcloud.dns API docs *(coming soon)*
- gcloud.sql API docs *(coming soon)*

I'm getting weird errors... Can you help?
-----------------------------------------

Chances are you have some dependency problems,
if you're on Ubuntu,
Chances are you have some dependency problems...
If you're on Ubuntu,
try installing the pre-compiled packages::

$ sudo apt-get install python-crypto python-openssl libffi-dev
@@ -32,6 +40,7 @@ or try installing the development packages
and then ``pip install`` the dependencies again::

$ sudo apt-get install python-dev libssl-dev libffi-dev
$ pip install gcloud

How do I build the docs?
------------------------
@@ -50,4 +59,23 @@ Make sure you have ``nose`` installed and::

$ git clone git://github.com/GoogleCloudPlatform/gcloud-python.git
$ pip install unittest2 nose
$ cd gcloud-python
$ nosetests

How can I contribute?
---------------------

Before we can accept any pull requests
we have to jump through a couple of legal hurdles,
primarily a Contributor License Agreement (CLA):

- **If you are an individual writing original source code**
and you're sure you own the intellectual property,
then you'll need to sign an `individual CLA
<http://code.google.com/legal/individual-cla-v1.0.html>`_.
- **If you work for a company that wants to allow you to contribute your work**,
then you'll need to sign a `corporate CLA
<http://code.google.com/legal/corporate-cla-v1.0.html>`_.

You can sign these electronically (just scroll to the bottom).
After that, we'll be able to accept your pull requests.
docs/index.rst (1 addition, 1 deletion)
@@ -12,7 +12,7 @@ Google Cloud Python API

.. warning::
This library is **still under construction**
and is **not** the official Google Python API client library.
and is **not** the official Google Cloud Python API client library.

Getting started
---------------
docs/storage-getting-started.rst (1 addition, 4 deletions)
@@ -217,10 +217,7 @@ otherwise you'll get an error.

If you have a full bucket, you can delete it this way::

>>> bucket = connection.get_bucket('my-bucket')
>>> for key in bucket:
... key.delete()
>>> bucket.delete()
>>> bucket = connection.get_bucket('my-bucket', force=True)

Listing available buckets
-------------------------
gcloud/connection.py (84 additions, 0 deletions)
@@ -1,4 +1,8 @@
import httplib2
import json
import urllib

from gcloud import exceptions


class Connection(object):
@@ -42,3 +46,83 @@ def http(self):
self._http = self._credentials.authorize(self._http)
return self._http


class JsonConnection(Connection):

API_BASE_URL = 'https://www.googleapis.com'
"""The base of the API call URL."""

_EMPTY = object()
"""A pointer to represent an empty value for default arguments."""

def __init__(self, project=None, *args, **kwargs):

super(JsonConnection, self).__init__(*args, **kwargs)

self.project = project

def build_api_url(self, path, query_params=None, api_base_url=None,
api_version=None):

url = self.API_URL_TEMPLATE.format(
api_base_url=(api_base_url or self.API_BASE_URL),
api_version=(api_version or self.API_VERSION),
path=path)

query_params = query_params or {}
query_params.update({'project': self.project})

url += '?' + urllib.urlencode(query_params)

return url

def make_request(self, method, url, data=None, content_type=None,
                 headers=None):

headers = headers or {}
headers['Accept-Encoding'] = 'gzip'

if data:
content_length = len(str(data))
else:
content_length = 0

headers['Content-Length'] = content_length

if content_type:
headers['Content-Type'] = content_type

return self.http.request(uri=url, method=method, headers=headers,
body=data)

def api_request(self, method, path=None, query_params=None,
data=None, content_type=None,
api_base_url=None, api_version=None,
expect_json=True):

url = self.build_api_url(path=path, query_params=query_params,
api_base_url=api_base_url,
api_version=api_version)

# Making the executive decision that any dictionary
# data will be sent properly as JSON.
if data and isinstance(data, dict):
data = json.dumps(data)
content_type = 'application/json'

response, content = self.make_request(
method=method, url=url, data=data, content_type=content_type)

# TODO: Add better error handling.
if response.status == 404:
raise exceptions.NotFoundError(response, content)
elif not 200 <= response.status < 300:
raise exceptions.ConnectionError(response, content)

if content and expect_json:
# TODO: Better checking on this header for JSON.
content_type = response.get('content-type', '')

if not content_type.startswith('application/json'):
raise TypeError('Expected JSON, got %s' % content_type)
return json.loads(content)

return content
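
For anyone who wants to try the new ``JsonConnection`` locally, here is a minimal, hypothetical sketch (not part of this diff). It assumes a service-specific subclass supplying the ``API_VERSION`` and ``API_URL_TEMPLATE`` attributes that ``build_api_url`` references, and that the base ``Connection`` accepts a ``credentials`` keyword, as its ``http`` property suggests; ``DemoConnection`` and its URL template are illustrative names only::

    from gcloud.connection import JsonConnection


    class DemoConnection(JsonConnection):
        """Illustrative subclass; real services would define their own."""

        API_VERSION = 'v1'
        API_URL_TEMPLATE = '{api_base_url}/demo/{api_version}{path}'


    # With real service-account credentials (see gcloud.credentials), a call
    # might look like this:
    #
    #     connection = DemoConnection(project='my-project',
    #                                 credentials=credentials)
    #     records = connection.api_request(method='GET', path='/records')
    #
    # api_request JSON-encodes dict payloads, raises NotFoundError /
    # ConnectionError on error statuses, and parses JSON responses.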
gcloud/datastore/query.py (69 additions, 1 deletion)
@@ -3,6 +3,7 @@
from gcloud.datastore import datastore_v1_pb2 as datastore_pb
from gcloud.datastore import helpers
from gcloud.datastore.entity import Entity
from gcloud.datastore.key import Key


# TODO: Figure out how to properly handle namespaces.
@@ -132,6 +133,72 @@ def filter(self, expression, value):
setattr(property_filter.value, attr_name, pb_value)
return clone

def ancestor(self, ancestor):
"""Filter the query based on an ancestor.

This will return a clone of the current :class:`Query`
filtered by the ancestor provided.

For example::

>>> parent_key = Key.from_path('Person', '1')
>>> query = dataset.query('Person')
>>> filtered_query = query.ancestor(parent_key)

If you don't have a :class:`gcloud.datastore.key.Key` but just
know the path, you can provide that as well::

>>> query = dataset.query('Person')
>>> filtered_query = query.ancestor(['Person', '1'])

Each call to ``.ancestor()`` returns a cloned :class:`Query`;
however, a query may only have one ancestor at a time.

:type ancestor: :class:`gcloud.datastore.key.Key` or list
:param ancestor: Either a Key or a path of the form
``['Kind', 'id or name', 'Kind', 'id or name', ...]``.

:rtype: :class:`Query`
:returns: A Query filtered by the ancestor provided.
"""

clone = self._clone()

# If an ancestor filter already exists, remove it.
for i, filter in enumerate(clone._pb.filter.composite_filter.filter):
property_filter = filter.property_filter
if property_filter.operator == datastore_pb.PropertyFilter.HAS_ANCESTOR:
del clone._pb.filter.composite_filter.filter[i]

# If we just deleted the last item, make sure to clear out the filter
# property all together.
if not clone._pb.filter.composite_filter.filter:
clone._pb.ClearField('filter')

# If the ancestor is None, just return (we already removed the filter).
if not ancestor:
return clone

# If a list was provided, turn it into a Key.
if isinstance(ancestor, list):
ancestor = Key.from_path(*ancestor)

# If we don't have a Key value by now, something is wrong.
if not isinstance(ancestor, Key):
raise TypeError('Expected list or Key, got %s.' % type(ancestor))

# Get the composite filter and add a new property filter.
composite_filter = clone._pb.filter.composite_filter
composite_filter.operator = datastore_pb.CompositeFilter.AND

# Filter on __key__ HAS_ANCESTOR == ancestor.
ancestor_filter = composite_filter.filter.add().property_filter
ancestor_filter.property.name = '__key__'
ancestor_filter.operator = datastore_pb.PropertyFilter.HAS_ANCESTOR
ancestor_filter.value.key_value.CopyFrom(ancestor.to_protobuf())

return clone

def kind(self, *kinds):
"""Get or set the Kind of the Query.

@@ -244,4 +311,5 @@ def fetch(self, limit=None):
entity_pbs = self.dataset().connection().run_query(
query_pb=clone.to_protobuf(), dataset_id=self.dataset().id())

return [Entity.from_protobuf(entity) for entity in entity_pbs]
return [Entity.from_protobuf(entity, dataset=self.dataset())
for entity in entity_pbs]
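
A short, hypothetical usage sketch for the new ``ancestor()`` filter (not part of this diff). It assumes a ``dataset`` object obtained elsewhere (for example through the existing ``gcloud.datastore`` helpers); everything else comes from the methods shown above::

    from gcloud.datastore.key import Key


    def people_under_parent(dataset):
        """Fetch Person entities below a fixed parent key.

        ``dataset`` is assumed to be an already-connected
        ``gcloud.datastore.dataset.Dataset`` instance.
        """
        parent_key = Key.from_path('Person', '1')
        query = dataset.query('Person').ancestor(parent_key)
        # A path list also works: query.ancestor(['Person', '1']);
        # passing None clears a previously-set ancestor filter.
        return query.fetch()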
gcloud/dns/__init__.py (92 additions, 0 deletions)
@@ -0,0 +1,92 @@
"""Shortcut methods for getting set up with Google Cloud DNS.

You'll typically use these to get started with the API:

>>> import gcloud.dns
>>> zone = gcloud.dns.get_zone('zone-name-here', 'project-name-here',
                               'long-email@googleapis.com',
                               '/path/to/private.key')

The main concepts with this API are:

- :class:`gcloud.dns.connection.Connection`
which represents a connection between your machine
and the Cloud DNS API.

- :class:`gcloud.dns.zone.Zone`
which represents a particular zone.
"""


__version__ = '0.1'

# TODO: Allow specific scopes and authorization levels.
SCOPE = ('https://www.googleapis.com/auth/cloud-platform',
'https://www.googleapis.com/auth/ndev.clouddns.readonly',
'https://www.googleapis.com/auth/ndev.clouddns.readwrite')
"""The scope required for authenticating as a Cloud DNS consumer."""


def get_connection(project, client_email, private_key_path):
"""Shortcut method to establish a connection to Cloud DNS.

Use this if you are going to access several zones
with the same set of credentials:

>>> from gcloud import dns
>>> connection = dns.get_connection(project, email, key_path)
>>> zone1 = connection.get_zone('zone1')
>>> zone2 = connection.get_zone('zone2')

:type project: string
:param project: The name of the project to connect to.

:type client_email: string
:param client_email: The e-mail attached to the service account.

:type private_key_path: string
:param private_key_path: The path to a private key file (this file was
given to you when you created the service
account).

:rtype: :class:`gcloud.dns.connection.Connection`
:returns: A connection defined with the proper credentials.
"""

from gcloud.credentials import Credentials
from gcloud.dns.connection import Connection

credentials = Credentials.get_for_service_account(
client_email, private_key_path, scope=SCOPE)
return Connection(project=project, credentials=credentials)


def get_zone(zone, project, client_email, private_key_path):
"""Shortcut method to establish a connection to a particular zone.

You'll generally use this as the first call to working with the API:

>>> from gcloud import dns
>>> zone = dns.get_zone(zone, project, email, key_path)

:type zone: string
:param zone: The id of the zone you want to use.
This is akin to a disk name on a file system.

:type project: string
:param project: The name of the project to connect to.

:type client_email: string
:param client_email: The e-mail attached to the service account.

:type private_key_path: string
:param private_key_path: The path to a private key file (this file was
given to you when you created the service
account).

:rtype: :class:`gcloud.dns.zone.Zone`
:returns: A zone with a connection using the provided credentials.
"""

connection = get_connection(project, client_email, private_key_path)
return connection.get_zone(zone)