feat: implement AsyncMultiRangeDownloader with multiplexed bidi-gRPC stream support #16528
Open: zhixiangli wants to merge 18 commits into googleapis:main from zhixiangli:zhixiangli/multiplexing-downloader
+1,202 −172
Commits (18):
- ddb7561 feat: add _StreamMultiplexer for asyncio bidi-gRPC streams
- df2ef15 feat: integrate _StreamMultiplexer into AsyncMultiRangeDownloader
- cd1620a test: add system tests for concurrent AsyncMultiRangeDownloader
- 6ba44b0 test: use grpc_client_direct in AsyncMultiRangeDownloader system tests
- 7decd5e feat: rename _DEFAULT_PUT_TIMEOUT to _DEFAULT_PUT_TIMEOUT_SECONDS in …
- d4d9a48 chore: update copyright year to 2026
- eb0cf58 feat: rename my_generation to stream_generation in async_multi_range_…
- 5f6b5d7 test: parametrize test_mrd_concurrent_download for different chunk sizes
- 139a145 chore: format tests/unit/asyncio/test_async_multi_range_downloader.py
- e73e3a8 chore: format _stream_multiplexer.py and test_zonal.py
- 43fe4cb test: add Given/When/Then comments and cleanup test_stream_multiplexe…
- 3c46d9c test: cleanup redundant mock setups in test_async_multi_range_downloa…
- ba9d5b5 chore: add logging to _stream_multiplexer.py recv loop failure
- 2f9bf0e test: add more assertions to AsyncMultiRangeDownloader creation test
- be24e7d test: refactor test_async_multi_range_downloader.py with Arrange/Act/…
- 5e65760 test: refactor test_mrd_concurrent_download_out_of_bounds in test_zon…
- 7c6fee2 fix: update stream end condition to use grpc.aio.EOF in _stream_multi…
- 1f0b50b fix: log warning for unregistered read_id in stream multiplexer
packages/google-cloud-storage/google/cloud/storage/asyncio/_stream_multiplexer.py (+207, −0)
```python
# Copyright 2026 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from __future__ import annotations

import asyncio
import logging
from typing import Awaitable, Callable, Dict, Optional, Set

import grpc

from google.cloud import _storage_v2
from google.cloud.storage.asyncio.async_read_object_stream import (
    _AsyncReadObjectStream,
)

logger = logging.getLogger(__name__)

_DEFAULT_QUEUE_MAX_SIZE = 100
_DEFAULT_PUT_TIMEOUT_SECONDS = 20.0


class _StreamError:
    """Wraps an error with the stream generation that produced it."""

    def __init__(self, exception: Exception, generation: int):
        self.exception = exception
        self.generation = generation


class _StreamEnd:
    """Signals the stream closed normally."""

    pass


class _StreamMultiplexer:
    """Multiplexes concurrent download tasks over a single bidi-gRPC stream.

    Routes responses from a background recv loop to per-task asyncio.Queues
    keyed by read_id. Coordinates stream reopening via generation-gated
    locking.

    A slow consumer on one task will slow down the entire shared connection
    due to bounded queue backpressure propagating through gRPC flow control.
    """

    def __init__(
        self,
        stream: _AsyncReadObjectStream,
        queue_max_size: int = _DEFAULT_QUEUE_MAX_SIZE,
    ):
        self._stream = stream
        self._stream_generation: int = 0
        self._queues: Dict[int, asyncio.Queue] = {}
        self._reopen_lock = asyncio.Lock()
        self._recv_task: Optional[asyncio.Task] = None
        self._queue_max_size = queue_max_size

    @property
    def stream_generation(self) -> int:
        return self._stream_generation

    def register(self, read_ids: Set[int]) -> asyncio.Queue:
        """Register read_ids for a task and return its response queue."""
        queue = asyncio.Queue(maxsize=self._queue_max_size)
        for read_id in read_ids:
            self._queues[read_id] = queue
        return queue

    def unregister(self, read_ids: Set[int]) -> None:
        """Remove read_ids from routing."""
        for read_id in read_ids:
            self._queues.pop(read_id, None)

    def _get_unique_queues(self) -> Set[asyncio.Queue]:
        return set(self._queues.values())

    async def _put_with_timeout(self, queue: asyncio.Queue, item) -> None:
        try:
            await asyncio.wait_for(
                queue.put(item), timeout=_DEFAULT_PUT_TIMEOUT_SECONDS
            )
        except asyncio.TimeoutError:
            if queue not in self._get_unique_queues():
                logger.debug("Dropped item for unregistered queue.")
            else:
                logger.warning(
                    "Queue full for too long. Dropping item to prevent multiplexer hang."
                )

    def _ensure_recv_loop(self) -> None:
        if self._recv_task is None or self._recv_task.done():
            self._recv_task = asyncio.create_task(self._recv_loop())

    def _stop_recv_loop(self) -> None:
        if self._recv_task and not self._recv_task.done():
            self._recv_task.cancel()

    def _put_error_nowait(self, queue: asyncio.Queue, error: _StreamError) -> None:
        while True:
            try:
                queue.put_nowait(error)
                break
            except asyncio.QueueFull:
                try:
                    queue.get_nowait()
                except asyncio.QueueEmpty:
                    pass

    async def _recv_loop(self) -> None:
        try:
            while True:
                response = await self._stream.recv()
                if response == grpc.aio.EOF:
                    sentinel = _StreamEnd()
                    await asyncio.gather(
                        *(
                            self._put_with_timeout(queue, sentinel)
                            for queue in self._get_unique_queues()
                        )
                    )
                    return

                if response.object_data_ranges:
                    queues_to_notify: Set[asyncio.Queue] = set()
                    for data_range in response.object_data_ranges:
                        read_id = data_range.read_range.read_id
                        queue = self._queues.get(read_id)
                        if queue:
                            queues_to_notify.add(queue)
                        else:
                            logger.warning(
                                f"Received data for unregistered read_id: {read_id}"
                            )
                    await asyncio.gather(
                        *(
                            self._put_with_timeout(queue, response)
                            for queue in queues_to_notify
                        )
                    )
                else:
                    await asyncio.gather(
                        *(
                            self._put_with_timeout(queue, response)
                            for queue in self._get_unique_queues()
                        )
                    )
        except asyncio.CancelledError:
            raise
        except Exception as e:
            logger.warning(f"Stream multiplexer recv loop failed: {e}", exc_info=True)
            error = _StreamError(e, self._stream_generation)
            for queue in self._get_unique_queues():
                self._put_error_nowait(queue, error)

    async def send(self, request: _storage_v2.BidiReadObjectRequest) -> int:
        self._ensure_recv_loop()
        await self._stream.send(request)
        return self._stream_generation

    async def reopen_stream(
        self,
        broken_generation: int,
        stream_factory: Callable[[], Awaitable[_AsyncReadObjectStream]],
    ) -> None:
        async with self._reopen_lock:
            if self._stream_generation != broken_generation:
                return
            self._stop_recv_loop()
            if self._recv_task:
                try:
                    await self._recv_task
                except (asyncio.CancelledError, Exception):
                    pass
            error = _StreamError(Exception("Stream reopening"), self._stream_generation)
            for queue in self._get_unique_queues():
                self._put_error_nowait(queue, error)
            try:
                await self._stream.close()
            except Exception:
                pass
            self._stream = await stream_factory()
            self._stream_generation += 1
            self._ensure_recv_loop()

    async def close(self) -> None:
        self._stop_recv_loop()
        if self._recv_task:
            try:
                await self._recv_task
            except (asyncio.CancelledError, Exception):
                pass
        error = _StreamError(Exception("Multiplexer closed"), self._stream_generation)
        for queue in self._get_unique_queues():
            self._put_error_nowait(queue, error)
```
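To make the routing idea in `register`/`_recv_loop` concrete without the gRPC and `_storage_v2` dependencies, here is a minimal, self-contained sketch of the same pattern: each task registers the `read_id`s it owns, and a router delivers interleaved responses from one shared source to the right per-task queue. `MiniMultiplexer` and its `route` method are hypothetical simplifications, not part of the PR.

```python
import asyncio
from typing import Dict, Set


class MiniMultiplexer:
    """Toy router: maps read_id -> per-task queue, like _StreamMultiplexer."""

    def __init__(self):
        self._queues: Dict[int, asyncio.Queue] = {}

    def register(self, read_ids: Set[int]) -> asyncio.Queue:
        # All read_ids for one task share a single queue.
        queue: asyncio.Queue = asyncio.Queue()
        for read_id in read_ids:
            self._queues[read_id] = queue
        return queue

    def route(self, read_id: int, payload: str) -> None:
        # Deliver to the owning task's queue; drop if unregistered.
        queue = self._queues.get(read_id)
        if queue is not None:
            queue.put_nowait((read_id, payload))


async def main():
    mux = MiniMultiplexer()
    q_a = mux.register({1, 2})  # task A owns read_ids 1 and 2
    q_b = mux.register({3})     # task B owns read_id 3
    # Simulate interleaved responses arriving on one shared stream.
    for read_id, payload in [(1, "x"), (3, "y"), (2, "z")]:
        mux.route(read_id, payload)
    print([q_a.get_nowait() for _ in range(2)])  # task A's items, in order
    print(q_b.get_nowait())                      # task B's item


asyncio.run(main())
```

The real class adds bounded queues with put timeouts and broadcast of EOF/error sentinels, but the core read_id-to-queue dictionary lookup is the same.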
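The generation check in `reopen_stream` is worth isolating: when several tasks all observe a failure on the same stream generation and race to reopen, only the first one holding the lock should actually replace the stream; the rest see the bumped generation and return. A self-contained sketch of just that gate (the `ReopenCoordinator` class is a hypothetical reduction, with a counter standing in for the stream swap):

```python
import asyncio


class ReopenCoordinator:
    """Toy generation-gated reopen: only the first caller that still sees
    the broken generation reopens; concurrent callers become no-ops."""

    def __init__(self):
        self.generation = 0
        self.reopens = 0  # stands in for the actual stream replacement
        self._lock = asyncio.Lock()

    async def reopen(self, broken_generation: int) -> None:
        async with self._lock:
            if self.generation != broken_generation:
                return  # someone else already replaced the stream
            self.reopens += 1
            self.generation += 1


async def main():
    coord = ReopenCoordinator()
    # Three tasks all observed a failure on generation 0 and race to reopen.
    await asyncio.gather(*(coord.reopen(0) for _ in range(3)))
    print(coord.reopens, coord.generation)  # one reopen, generation bumped once


asyncio.run(main())
```

This is why `send` returns the current `stream_generation`: a task records which generation carried its request, and on failure passes that value as `broken_generation` so a stream that was already replaced is not reopened twice.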