This repository was archived by the owner on Nov 12, 2025. It is now read-only.

Commit 6254bf2

plamu and tswast authored
chore: Release v2.0.0 (#64)
* chore: release v2.0.0
* Update CHANGELOG.md
* Replace PROJECT_ID with GOOGLE_CLOUD_PROJECT in test
* Do not add google.cloud.bigquery as namespace package
* Install the library as non-editable in tests

  This avoids import errors from the google.cloud.bigquery.* namespace.
* Fix test coverage plugin paths
* Regenerate code with different namespace (bigquery_storage)
* Adjust import paths to bigquery_storage namespace
* Adjust docs to bigquery_storage namespace
* Adjust UPGRADING guide to changed namespace

Co-authored-by: Tim Swast <swast@google.com>
1 parent 0a0eb2e commit 6254bf2


41 files changed: +173 −171 lines

CHANGELOG.md

Lines changed: 11 additions & 0 deletions

@@ -4,6 +4,17 @@
 [1]: https://pypi.org/project/google-cloud-bigquery-storage/#history

+## 2.0.0
+
+09-24-2020 08:21 PDT
+
+### Implementation Changes
+
+- Transition the library to microgenerator. ([#62](https://github.com/googleapis/python-bigquery-storage/pull/62))
+  This is a **breaking change** that introduces several **method signature changes** and **drops support
+  for Python 2.7 and 3.5**. See the [migration guide](https://googleapis.dev/python/bigquerystorage/latest/UPGRADING.html)
+  for more info.
+
 ## 1.1.0

 09-14-2020 08:51 PDT
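The "method signature changes" called out in the changelog entry above come down to one pattern: microgenerated methods take a single `request` mapping instead of flattened positional arguments. A minimal before/after sketch with a dummy client (illustrative only, not the real API surface):

```python
# Hypothetical sketch of the 2.0.0 calling convention. `DummyClient` is an
# invented stand-in, not the generated BigQueryReadClient.

class DummyClient:
    # 1.x style (illustrative): create_read_session(parent, read_session, ...)
    # 2.x style: all arguments travel in one `request` mapping.
    def create_read_session(self, request):
        # Echo back the fields to show how the request object is unpacked.
        return {
            "parent": request["parent"],
            "table": request["read_session"]["table"],
        }

client = DummyClient()
session = client.create_read_session(
    request={
        "parent": "projects/your-billing-project-id",
        "read_session": {"table": "projects/p/datasets/d/tables/t"},
    }
)
```

The request-mapping style mirrors the underlying protobuf request messages one-to-one, which is what lets the microgenerator produce the surface mechanically.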

UPGRADING.md

Lines changed: 10 additions & 9 deletions

@@ -33,9 +33,10 @@ The 2.0.0 release requires Python 3.6+.

 ## Import Path

-The library was moved into `google.cloud.bigquery` namespace. It is recommended
-to use this path in order to reduce the chance of future compatibility issues
-in case the library is restuctured internally.
+The library's top-level namespace is `google.cloud.bigquery_storage`. Importing
+from `google.cloud.bigquery_storage_v1` still works, but it is advisable to use
+the `google.cloud.bigquery_storage` path in order to reduce the chance of future
+compatibility issues should the library be restructured internally.

 **Before:**
 ```py
@@ -44,7 +45,7 @@ from google.cloud.bigquery_storage_v1 import BigQueryReadClient

 **After:**
 ```py
-from google.cloud.bigquery.storage import BigQueryReadClient
+from google.cloud.bigquery_storage import BigQueryReadClient
 ```

@@ -65,7 +66,7 @@ data_format = BigQueryReadClient.enums.DataFormat.ARROW

 **After:**
 ```py
-from google.cloud.bigquery.storage import types
+from google.cloud.bigquery_storage import types

 data_format = types.DataFormat.ARROW
 ```

@@ -157,13 +158,13 @@ session = client.create_read_session(

 **After:**
 ```py
-from google.cloud.bigquery import storage
+from google.cloud import bigquery_storage

-client = storage.BigQueryReadClient()
+client = bigquery_storage.BigQueryReadClient()

-requested_session = storage.types.ReadSession(
+requested_session = bigquery_storage.types.ReadSession(
     table="projects/PROJECT_ID/datasets/DATASET_ID/tables/TABLE_ID",
-    data_format=storage.types.DataFormat.ARROW,
+    data_format=bigquery_storage.types.DataFormat.ARROW,
 )
 session = client.create_read_session(
     request={
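The guide above says the old `google.cloud.bigquery_storage_v1` import path keeps working while the flat `google.cloud.bigquery_storage` path is preferred. That only holds if both paths resolve to the same objects. A self-contained sketch of that mechanism using `sys.modules`; `demo_pkg` and its submodules are invented stand-ins, not the library's actual layout:

```python
import sys
import types

# Hypothetical illustration: make an old versioned path and a new flat path
# resolve to the same class object, the situation UPGRADING.md describes.

class BigQueryReadClient:  # dummy placeholder, not the real client
    pass

# Stand-in for the versioned module that really defines the client.
_v1 = types.ModuleType("demo_pkg.storage_v1")
_v1.BigQueryReadClient = BigQueryReadClient

# Stand-in for the preferred top-level module that re-exports the same name.
_top = types.ModuleType("demo_pkg.storage")
_top.BigQueryReadClient = _v1.BigQueryReadClient

# Register everything in sys.modules so normal import statements find it.
sys.modules["demo_pkg"] = types.ModuleType("demo_pkg")
sys.modules["demo_pkg.storage_v1"] = _v1
sys.modules["demo_pkg.storage"] = _top

# Both import paths now hand back the identical class object.
from demo_pkg.storage import BigQueryReadClient as NewPath
from demo_pkg.storage_v1 import BigQueryReadClient as OldPath
assert NewPath is OldPath
```

Because both names are the same object, code written against either path stays compatible, which is why the guide calls the new path "advisable" rather than mandatory.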
File renamed without changes.
Lines changed: 1 addition & 1 deletion

@@ -1,6 +1,6 @@
 Services for Google Cloud Bigquery Storage v1 API
 =================================================

-.. automodule:: google.cloud.bigquery.storage_v1.services.big_query_read
+.. automodule:: google.cloud.bigquery_storage_v1.services.big_query_read
    :members:
    :inherited-members:
Lines changed: 1 addition & 1 deletion

@@ -1,5 +1,5 @@
 Types for Google Cloud Bigquery Storage v1 API
 ==============================================

-.. automodule:: google.cloud.bigquery.storage_v1.types
+.. automodule:: google.cloud.bigquery_storage_v1.types
    :members:

docs/index.rst

Lines changed: 3 additions & 3 deletions

@@ -18,9 +18,9 @@ API Reference
 .. toctree::
    :maxdepth: 2

-   storage_v1/library
-   storage_v1/services
-   storage_v1/types
+   bigquery_storage_v1/library
+   bigquery_storage_v1/services
+   bigquery_storage_v1/types


 Migration Guide

google/cloud/bigquery/storage_v1/__init__.py

Lines changed: 0 additions & 51 deletions
This file was deleted.
Lines changed: 15 additions & 15 deletions

@@ -16,22 +16,22 @@
 #

 from google.cloud.bigquery_storage_v1 import BigQueryReadClient
-from google.cloud.bigquery_storage_v1 import types
+from google.cloud.bigquery_storage_v1 import gapic_types as types
 from google.cloud.bigquery_storage_v1 import __version__
-from google.cloud.bigquery.storage_v1.types.arrow import ArrowRecordBatch
-from google.cloud.bigquery.storage_v1.types.arrow import ArrowSchema
-from google.cloud.bigquery.storage_v1.types.avro import AvroRows
-from google.cloud.bigquery.storage_v1.types.avro import AvroSchema
-from google.cloud.bigquery.storage_v1.types.storage import CreateReadSessionRequest
-from google.cloud.bigquery.storage_v1.types.storage import ReadRowsRequest
-from google.cloud.bigquery.storage_v1.types.storage import ReadRowsResponse
-from google.cloud.bigquery.storage_v1.types.storage import SplitReadStreamRequest
-from google.cloud.bigquery.storage_v1.types.storage import SplitReadStreamResponse
-from google.cloud.bigquery.storage_v1.types.storage import StreamStats
-from google.cloud.bigquery.storage_v1.types.storage import ThrottleState
-from google.cloud.bigquery.storage_v1.types.stream import DataFormat
-from google.cloud.bigquery.storage_v1.types.stream import ReadSession
-from google.cloud.bigquery.storage_v1.types.stream import ReadStream
+from google.cloud.bigquery_storage_v1.types.arrow import ArrowRecordBatch
+from google.cloud.bigquery_storage_v1.types.arrow import ArrowSchema
+from google.cloud.bigquery_storage_v1.types.avro import AvroRows
+from google.cloud.bigquery_storage_v1.types.avro import AvroSchema
+from google.cloud.bigquery_storage_v1.types.storage import CreateReadSessionRequest
+from google.cloud.bigquery_storage_v1.types.storage import ReadRowsRequest
+from google.cloud.bigquery_storage_v1.types.storage import ReadRowsResponse
+from google.cloud.bigquery_storage_v1.types.storage import SplitReadStreamRequest
+from google.cloud.bigquery_storage_v1.types.storage import SplitReadStreamResponse
+from google.cloud.bigquery_storage_v1.types.storage import StreamStats
+from google.cloud.bigquery_storage_v1.types.storage import ThrottleState
+from google.cloud.bigquery_storage_v1.types.stream import DataFormat
+from google.cloud.bigquery_storage_v1.types.stream import ReadSession
+from google.cloud.bigquery_storage_v1.types.stream import ReadStream

 __all__ = (
     "__version__",

google/cloud/bigquery_storage_v1/client.py

Lines changed: 6 additions & 6 deletions

@@ -23,8 +23,8 @@

 import google.api_core.gapic_v1.method

-from google.cloud.bigquery import storage_v1
 from google.cloud.bigquery_storage_v1 import reader
+from google.cloud.bigquery_storage_v1.services import big_query_read


 _SCOPES = (
@@ -33,7 +33,7 @@
 )


-class BigQueryReadClient(storage_v1.BigQueryReadClient):
+class BigQueryReadClient(big_query_read.BigQueryReadClient):
     """Client for interacting with BigQuery Storage API.

     The BigQuery storage API can be used to read data stored in BigQuery.
@@ -60,9 +60,9 @@ def read_rows(
         to read data.

         Example:
-            >>> from google.cloud.bigquery import storage
+            >>> from google.cloud import bigquery_storage
             >>>
-            >>> client = storage.BigQueryReadClient()
+            >>> client = bigquery_storage.BigQueryReadClient()
             >>>
             >>> # TODO: Initialize ``table``:
             >>> table = "projects/{}/datasets/{}/tables/{}".format(
@@ -74,9 +74,9 @@ def read_rows(
             >>> # TODO: Initialize `parent`:
             >>> parent = 'projects/your-billing-project-id'
             >>>
-            >>> requested_session = storage.types.ReadSession(
+            >>> requested_session = bigquery_storage.types.ReadSession(
             ...     table=table,
-            ...     data_format=storage.types.DataFormat.AVRO,
+            ...     data_format=bigquery_storage.types.DataFormat.AVRO,
             ... )
             >>> session = client.create_read_session(
             ...     parent=parent, read_session=requested_session
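The `client.py` change above keeps the hand-written `BigQueryReadClient` as a thin subclass of the generated `big_query_read` client, layering conveniences over the raw surface. A rough sketch of that pattern with dummy classes (none of these bodies are the real implementation):

```python
# Hypothetical sketch of a hand-written wrapper over a generated client.

class GeneratedReadClient:
    """Stands in for the microgenerator-produced services client."""

    def read_rows(self, request):
        # The generated surface takes a request mapping and returns a
        # low-level stream; faked here as a list of response messages.
        return [{"row_count": 2}, {"row_count": 3}]

class BigQueryReadClient(GeneratedReadClient):
    """Hand-written layer adding a friendlier read_rows signature."""

    def read_rows(self, name, offset=0):
        # Translate the convenience arguments into the generated
        # request-mapping call, then post-process the raw responses.
        raw = super().read_rows(request={"read_stream": name, "offset": offset})
        return sum(page["row_count"] for page in raw)

client = BigQueryReadClient()
total = client.read_rows("projects/p/locations/l/sessions/s/streams/0")
```

Keeping the convenience layer as a subclass means the generated client can be regenerated freely while the hand-written additions survive, which is the arrangement this commit's import change establishes.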
