
Commit 52ab4de

plamut authored and parthea committed
chore: transition the library to microgenerator (#62)
* chore: remove old GAPIC code for v1 API
* Regenerate the v1 API with microgenerator
* Adjust dependencies and classifiers in setup.py
* Fix types aggregation in types.py
* Adjust import paths
* Fix and adjust unit tests
* Fix and adjust system tests
* Adjust unit test coverage threshold

  Not all paths are covered, not even in the generated code, thus the adjustment is necessary.

* Fix docs build
* Adjust quickstart sample
* Adjust sample in client docstring
* Remove beta API code and docs
* Simplify synth replacement rules and regenerate

  Rules conditionally matching versions other than v1 are not needed anymore.

* Consolidate imports in google.cloud.bigquery.storage
* Use google.cloud.bigquery.storage as import path
* Hide async client from most import paths
* Use GAPIC client mock in ReadRowsStream tests
* Remove redundant installations in nox sessions
* Include manual classes in reference docs
* Add UPGRADING guide
* Add minor CHANGELOG improvements
1 parent 6da77e5 commit 52ab4de

File tree

123 files changed (+4852, -14372 lines)


packages/google-cloud-bigquery-storage/.kokoro/release/common.cfg

Lines changed: 8 additions & 8 deletions
```diff
@@ -23,14 +23,14 @@ env_vars: {
     value: "github/python-bigquery-storage/.kokoro/release.sh"
 }

-# Fetch PyPI password
-before_action {
-  fetch_keystore {
-    keystore_resource {
-      keystore_config_id: 73713
-      keyname: "google_cloud_pypi_password"
-    }
-  }
+# Fetch PyPI password
+before_action {
+  fetch_keystore {
+    keystore_resource {
+      keystore_config_id: 73713
+      keyname: "google_cloud_pypi_password"
+    }
+  }
 }

 # Tokens needed to report release status back to GitHub
```

packages/google-cloud-bigquery-storage/CONTRIBUTING.rst

Lines changed: 0 additions & 19 deletions
```diff
@@ -80,25 +80,6 @@ We use `nox <https://nox.readthedocs.io/en/latest/>`__ to instrument our tests.

 .. nox: https://pypi.org/project/nox/

-Note on Editable Installs / Develop Mode
-========================================
-
-As mentioned previously, using ``setuptools`` in `develop mode`_
-or a ``pip`` `editable install`_ is not possible with this
-library. This is because this library uses `namespace packages`_.
-For context see `Issue #2316`_ and the relevant `PyPA issue`_.
-
-Since ``editable`` / ``develop`` mode can't be used, packages
-need to be installed directly. Hence your changes to the source
-tree don't get incorporated into the **already installed**
-package.
-
-.. _namespace packages: https://www.python.org/dev/peps/pep-0420/
-.. _Issue #2316: https://github.com/GoogleCloudPlatform/google-cloud-python/issues/2316
-.. _PyPA issue: https://github.com/pypa/packaging-problems/issues/12
-.. _develop mode: https://setuptools.readthedocs.io/en/latest/setuptools.html#development-mode
-.. _editable install: https://pip.pypa.io/en/stable/reference/pip_install/#editable-installs
-
 *****************************************
 I'm getting weird errors... Can you help?
 *****************************************
```

packages/google-cloud-bigquery-storage/README.rst

Lines changed: 7 additions & 4 deletions
```diff
@@ -49,11 +49,14 @@ dependencies.

 Supported Python Versions
 ^^^^^^^^^^^^^^^^^^^^^^^^^
-Python >= 3.5
+Python >= 3.6

-Deprecated Python Versions
-^^^^^^^^^^^^^^^^^^^^^^^^^^
-Python == 2.7. Python 2.7 support will be removed on January 1, 2020.
+Unsupported Python Versions
+^^^^^^^^^^^^^^^^^^^^^^^^^^^
+Python == 2.7, Python == 3.5.
+
+The last version of this library compatible with Python 2.7 and 3.5 is
+``google-cloud-bigquery-storage==1.1.0``.


 Mac/Linux
```
packages/google-cloud-bigquery-storage/UPGRADING.md

Lines changed: 282 additions & 0 deletions (new file)
<!--
Copyright 2020 Google LLC
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
https://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->


# 2.0.0 Migration Guide

The 2.0 release of the `google-cloud-bigquery-storage` client is a significant
upgrade based on a [next-gen code generator](https://github.com/googleapis/gapic-generator-python),
and includes substantial interface changes. Existing code written for earlier versions
of this library will likely require updates to use this version. This document
describes the changes that have been made, and what you need to do to update your usage.

If you experience issues or have questions, please file an
[issue](https://github.com/googleapis/python-bigquery-storage/issues).


## Supported Python Versions

> **WARNING**: Breaking change

The 2.0.0 release requires Python 3.6+.
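A project depending on this release could enforce the new floor with a startup check. This is an illustrative sketch (the guard, the `MIN_PYTHON` name, and the error message are hypothetical, not part of the library):

```python
import sys

# Hypothetical startup guard for a project depending on
# google-cloud-bigquery-storage>=2.0.0, which requires Python 3.6+.
MIN_PYTHON = (3, 6)

def check_python_version(version_info=sys.version_info):
    """Return True if the running interpreter satisfies MIN_PYTHON."""
    return tuple(version_info[:2]) >= MIN_PYTHON

if not check_python_version():
    raise RuntimeError(
        "google-cloud-bigquery-storage 2.x requires Python "
        "%d.%d+" % MIN_PYTHON
    )
```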
## Import Path

The library was moved into the `google.cloud.bigquery` namespace. It is recommended
to use this path in order to reduce the chance of future compatibility issues
in case the library is restructured internally.

**Before:**
```py
from google.cloud.bigquery_storage_v1 import BigQueryReadClient
```

**After:**
```py
from google.cloud.bigquery.storage import BigQueryReadClient
```


## Enum Types

> **WARNING**: Breaking change

Enum types have been moved. Access them through the `types` module.

**Before:**
```py
from google.cloud.bigquery_storage_v1 import enums

data_format = enums.DataFormat.ARROW
```

**After:**
```py
from google.cloud.bigquery.storage import types

data_format = types.DataFormat.ARROW
```

Additionally, enums can no longer be accessed through the client. The following
code will _not_ work:
```py
data_format = BigQueryReadClient.enums.DataFormat.ARROW
```

## Clients for Beta APIs

> **WARNING**: Breaking change

Clients for beta APIs have been removed. The following imports will _not_ work:

```py
from google.cloud.bigquery_storage_v1beta1 import BigQueryStorageClient
from google.cloud.bigquery_storage_v1beta2.gapic.big_query_read_client import BigQueryReadClient
```

The beta APIs are still available on the server side, but you will need to use
the 1.x version of the library to access them.


## Changed Default Value of the `read_rows()` Method's `metadata` Argument

The `client.read_rows()` method no longer accepts `None` as a valid value
for the optional `metadata` argument. If not given, an empty tuple is used; if
you want to explicitly pass an "empty" value, you should use an empty tuple, too.

**Before:**
```py
client.read_rows("stream_name", metadata=None)
```

**After:**
```py
client.read_rows("stream_name", metadata=())
```

OR

```py
client.read_rows("stream_name")
```

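The new default behaves like an ordinary keyword default. The following is a schematic stand-in for the real `read_rows()` (the validation and return value are illustrative, not the library's implementation):

```python
# Schematic sketch of the new behavior: metadata defaults to an empty
# tuple, and None is no longer accepted. Not the real read_rows().
def read_rows(name, metadata=()):
    if not isinstance(metadata, (tuple, list)):
        raise TypeError("metadata must be a sequence of key/value pairs")
    return {"name": name, "metadata": tuple(metadata)}
```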
## Method Calls

> **WARNING**: Breaking change

Most of the client methods that send requests to the backend expect request objects.
We provide a script that will convert most common use cases.

> One exception to this is `BigQueryReadClient.read_rows()`, which is a hand-written
> wrapper around the auto-generated `read_rows()` method.

* Install the library:

```sh
python3 -m pip install google-cloud-bigquery-storage
```

* The script `fixup_storage_v1_keywords.py` is shipped with the library. It expects
an input directory (with the code to convert) and an empty destination directory.

```sh
$ scripts/fixup_storage_v1_keywords.py --input-directory .samples/ --output-directory samples/
```

**Before:**
```py
from google.cloud import bigquery_storage_v1

client = bigquery_storage_v1.BigQueryReadClient()

requested_session = bigquery_storage_v1.types.ReadSession()
requested_session.table = "projects/PROJECT_ID/datasets/DATASET_ID/tables/TABLE_ID"
requested_session.data_format = bigquery_storage_v1.enums.DataFormat.ARROW

session = client.create_read_session(
    "projects/parent_project",
    requested_session,
    max_stream_count=1,
)
```

**After:**
```py
from google.cloud.bigquery import storage

client = storage.BigQueryReadClient()

requested_session = storage.types.ReadSession(
    table="projects/PROJECT_ID/datasets/DATASET_ID/tables/TABLE_ID",
    data_format=storage.types.DataFormat.ARROW,
)
session = client.create_read_session(
    request={
        "parent": "projects/parent_project",
        "read_session": requested_session,
        "max_stream_count": 1,
    },
)
```

### More Details

In `google-cloud-bigquery-storage<2.0.0`, parameters required by the API were positional
parameters and optional parameters were keyword parameters.

**Before:**
```py
def create_read_session(
    self,
    parent,
    read_session,
    max_stream_count=None,
    retry=google.api_core.gapic_v1.method.DEFAULT,
    timeout=google.api_core.gapic_v1.method.DEFAULT,
    metadata=None,
):
```

In the `2.0.0` release, methods that interact with the backend have a single
positional parameter `request`. Method docstrings indicate whether a parameter is
required or optional.

Some methods have additional keyword-only parameters. The available parameters depend
on the [`google.api.method_signature` annotation](https://github.com/googleapis/python-bigquery-storage/blob/9e1bf910e6f5010f479cf4592e25c3b3eebb456d/google/cloud/bigquery_storage_v1/proto/storage.proto#L73)
specified by the API producer.


**After:**
```py
def create_read_session(
    self,
    request: storage.CreateReadSessionRequest = None,
    *,
    parent: str = None,
    read_session: stream.ReadSession = None,
    max_stream_count: int = None,
    retry: retries.Retry = gapic_v1.method.DEFAULT,
    timeout: float = None,
    metadata: Sequence[Tuple[str, str]] = (),
) -> stream.ReadSession:
```

> **NOTE:** The `request` parameter and flattened keyword parameters for the API are
> mutually exclusive. Passing both will result in an error.

Both of these calls are valid:

```py
session = client.create_read_session(
    request={
        "parent": "projects/parent_project",
        "read_session": requested_session,
        "max_stream_count": 1,
    },
)
```

```py
response = client.create_read_session(
    parent="projects/parent_project",
    read_session=requested_session,
    max_stream_count=1,
)
```

This call is _invalid_ because it mixes `request` with the keyword argument
`max_stream_count`. Executing this code will result in an error:

```py
session = client.create_read_session(
    request={
        "parent": "projects/parent_project",
        "read_session": requested_session,
    },
    max_stream_count=1,
)
```

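The mutual-exclusivity rule can be sketched in plain Python. This is a simplified illustration of the pattern, not the actual generated `create_read_session`:

```python
# Simplified sketch of how a generated method might enforce that `request`
# and flattened keyword arguments are mutually exclusive. Illustration
# only; the real client builds and sends a protobuf request instead.
def create_read_session(request=None, *, parent=None, read_session=None,
                        max_stream_count=None):
    has_flattened = any(
        arg is not None for arg in (parent, read_session, max_stream_count)
    )
    if request is not None and has_flattened:
        raise ValueError(
            "If the `request` argument is set, then none of the individual "
            "field arguments should be set."
        )
    # ... build and send the actual request (omitted in this sketch) ...
    return request or {
        "parent": parent,
        "read_session": read_session,
        "max_stream_count": max_stream_count,
    }
```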
> **NOTE:** The `request` parameter of some methods can also contain a richer set of
> options that are otherwise not available as explicit keyword-only parameters, thus
> these _must_ be passed through `request`.


## Removed Utility Methods

> **WARNING**: Breaking change

Several utility methods such as `project_path()` and `table_path()` have been removed.
These paths must now be constructed manually:

```py
project_path = f"projects/{PROJECT_ID}"
table_path = f"projects/{PROJECT_ID}/datasets/{DATASET_ID}/tables/{TABLE_ID}"
```

The two methods that remain are `read_session_path()` and `read_stream_path()`.

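If you prefer keeping helpers, equivalents of the removed methods are easy to recreate. These are hypothetical stand-ins (not part of the library) that build the standard resource name strings:

```python
# Hypothetical replacements for the removed project_path() / table_path()
# helpers; they simply format the standard resource name strings.
def project_path(project_id):
    """Return the resource name of a project."""
    return f"projects/{project_id}"

def table_path(project_id, dataset_id, table_id):
    """Return the resource name of a BigQuery table."""
    return f"projects/{project_id}/datasets/{dataset_id}/tables/{table_id}"
```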

## Removed `client_config` and `channel` Parameters

The client can no longer be constructed with the `channel` or `client_config` arguments;
these deprecated parameters have been removed.

If you used `client_config` to customize retry and timeout settings for a particular
method, you now need to do it upon method invocation by passing the custom `timeout` and
`retry` arguments, respectively.
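The idea behind per-call overrides can be sketched in plain Python. This is a schematic stand-in for the real `retry`/`timeout` machinery in `google.api_core`, not the library's actual implementation; the helper name and parameters are hypothetical:

```python
import time

# Schematic stand-in for per-invocation retry/timeout settings that used
# to live in the removed client_config. Hypothetical helper, not library
# code: each call site can pass its own attempt count and timeout.
def call_with_retry(func, *, max_attempts=3, base_delay=0.0, timeout=None):
    """Call func(timeout=timeout), retrying transient ConnectionErrors
    with exponential backoff between attempts."""
    for attempt in range(max_attempts):
        try:
            return func(timeout=timeout)
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise  # out of attempts; surface the error
            time.sleep(base_delay * (2 ** attempt))

# Per-call override: this invocation retries up to 5 times with a 30 s
# timeout, while other call sites can use entirely different settings.
# call_with_retry(some_rpc, max_attempts=5, timeout=30.0)
```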
Lines changed: 1 addition & 0 deletions
```diff
@@ -0,0 +1 @@
+../UPGRADING.md
```

packages/google-cloud-bigquery-storage/docs/conf.py

Lines changed: 1 addition & 0 deletions
```diff
@@ -39,6 +39,7 @@
     "sphinx.ext.autosummary",
     "sphinx.ext.intersphinx",
     "sphinx.ext.coverage",
+    "sphinx.ext.doctest",
     "sphinx.ext.napoleon",
     "sphinx.ext.todo",
     "sphinx.ext.viewcode",
```

packages/google-cloud-bigquery-storage/docs/gapic/v1/api.rst

Lines changed: 0 additions & 6 deletions
This file was deleted.

packages/google-cloud-bigquery-storage/docs/gapic/v1/types.rst

Lines changed: 0 additions & 5 deletions
This file was deleted.

packages/google-cloud-bigquery-storage/docs/gapic/v1beta1/api.rst

Lines changed: 0 additions & 6 deletions
This file was deleted.

packages/google-cloud-bigquery-storage/docs/gapic/v1beta1/reader.rst

Lines changed: 0 additions & 6 deletions
This file was deleted.
