Mirror of https://github.com/unshackle-dl/unshackle.git
Synced 2025-10-23 15:11:08 +00:00

Compare commits: 2.0.0...feature/re (2 commits)

| Author | SHA1 | Date |
|---|---|---|
|  | 984884858f |  |
|  | bdd219d90c |  |
.gitignore (vendored, 1 line added)

@@ -218,6 +218,7 @@ cython_debug/
 # you could uncomment the following to ignore the entire vscode folder
 .vscode/
 .github/copilot-instructions.md
+CLAUDE.md

 # Ruff stuff:
 .ruff_cache/
CHANGELOG.md (102 lines added)

@@ -5,6 +5,108 @@ All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [2.0.0] - 2025-10-25

### Breaking Changes

- **REST API Integration**: Core architecture modified to support REST API functionality
  - Changes to internal APIs for download management and tracking
  - Title and track classes updated with API integration points
  - Core component interfaces modified for queue management support
- **Configuration Changes**: New required configuration options for the API and enhanced features
  - `simkl_client_id` is now required for Simkl functionality
  - Service-specific configuration override structure introduced
  - Debug logging configuration options added
- **Forced Subtitles**: Behavior change for forced subtitle inclusion
  - Forced subs are no longer auto-included; they require the explicit `--forced-subs` flag

### Added

- **REST API Server**: Complete download management via REST API (early development)
  - Implemented download queue management and worker system
  - Added OpenAPI/Swagger documentation for easy API exploration
  - Included download progress tracking and status endpoints
  - API authentication and comprehensive error handling
  - Updated core components to support API integration
  - Early development work with more changes planned
- **CustomRemoteCDM**: Highly configurable custom CDM API support
  - Support for third-party CDM API providers with maximum configurability
  - Full configuration through YAML without code changes
  - Addresses GitHub issue #26 for flexible CDM integration
- **WindscribeVPN Proxy Provider**: New VPN provider support
  - Added WindscribeVPN following the NordVPN and SurfsharkVPN patterns
  - Fixes GitHub issue #29
- **Latest Episode Download**: New `--latest-episode` CLI option
  - `-le, --latest-episode` flag to download only the most recent episode
  - Automatically selects the single most recent episode regardless of season
  - Fixes GitHub issue #28
- **Service-Specific Configuration Overrides**: Per-service fine-tuned control
  - Support for per-service configuration overrides in YAML
  - Fine-tuned control of downloader and command options per service
  - Fixes GitHub issue #13
- **Comprehensive JSON Debug Logging**: Structured logging for troubleshooting
  - Binary toggle via the `--debug` flag or `debug: true` in config
  - JSON Lines (.jsonl) format for easy parsing and analysis
  - Comprehensive logging of all operations (session info, CLI params, CDM details, auth status, title/track metadata, DRM operations, vault queries)
  - Configurable key logging via the `debug_keys` option with smart redaction
  - Error logging for all critical operations
  - Removed the old text logging system
- **curl_cffi Retry Handler**: Enhanced session reliability
  - Added an automatic retry mechanism to the curl_cffi Session
  - Improved download reliability with configurable retries
- **Simkl API Configuration**: New API key support
  - Added the `simkl_client_id` configuration option
  - Simkl now requires a client_id from https://simkl.com/settings/developer/

### Changed

- **Binary Search Enhancement**: Improved binary discovery
  - Refactored to search for binaries in the root of the binary folder or in a subfolder named after the binary
  - Better organization of binary dependencies
- **Type Annotations**: Modernized to PEP 604 syntax
  - Updated session.py type annotations to use modern Python syntax
  - Improved code readability and type checking

### Fixed

- **Config Directory Support**: Cross-platform user config directory support
  - Fixed config loading to properly support user config directories across all platforms
  - Fixes GitHub issue #23
- **HYBRID Mode Validation**: Pre-download validation for hybrid processing
  - Added validation that both HDR10 and DV tracks are available before download
  - Prevents wasted downloads when hybrid processing would fail
- **TMDB/Simkl API Keys**: Graceful handling of missing API keys
  - Improved error handling when TMDB or Simkl API keys are not configured
  - Better user messaging for API configuration requirements
- **Forced Subtitles Behavior**: Correct forced subtitle filtering
  - Fixed forced subtitles being incorrectly included without the `--forced-subs` flag
  - Forced subs are now only included when explicitly requested
- **Font Attachment Constructor**: Fixed ASS/SSA font attachment
  - Use keyword arguments for the Attachment constructor in font attachment
  - Fixes the "Invalid URL: No scheme supplied" error
  - Fixes GitHub issue #24
- **Binary Subdirectory Checking**: Enhanced binary location discovery (by @TPD94, PR #19)
  - Updated binaries.py to check subdirectories in binaries folders named after the binary
  - Improved binary detection and loading
- **HLS Manifest Processing**: Minor HLS parser fix (by @TPD94, PR #19)
- **lxml and pyplayready**: Updated dependencies (by @Sp5rky)
  - Updated the lxml constraint and the pyplayready import path for compatibility

### Refactored

- **Import Cleanup**: Removed unused imports
  - Removed an unused mypy import from binaries.py
  - Fixed import ordering in the API download_manager and handlers

### Contributors

This release includes contributions from:

- @Sp5rky - REST API server implementation, dependency updates
- @stabbedbybrick - curl_cffi retry handler (PR #31)
- @TPD94 - Binary search enhancements, manifest parser fixes (PR #19)
- @scene (Andy) - Core features, configuration system, bug fixes

## [1.4.8] - 2025-10-08

### Added
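For reference, a minimal `unshackle.yaml` sketch of the 2.0.0 options called out above. Only the option names (`debug`, `debug_keys`, `simkl_client_id`, per-service overrides under `services`) come from this changelog and the config diff below; the override keys shown inside the service block are hypothetical.

```yaml
debug: true                               # enable JSON Lines (.jsonl) debug logging
debug_keys: false                         # opt in to key logging; smart redaction still applies
simkl_client_id: "YOUR_SIMKL_CLIENT_ID"   # from https://simkl.com/settings/developer/
services:
  EXAMPLE:                                # hypothetical service tag
    downloader: aria2c                    # per-service override; actual keys depend on the service
```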
@@ -34,6 +34,27 @@ def serve(host: str, port: int, caddy: bool, api_only: bool, no_key: bool) -> No
     \b
     The REST API provides programmatic access to unshackle functionality.
     Configure authentication in your config under serve.users and serve.api_secret.
+
+    \b
+    REMOTE SERVICES:
+    The server exposes endpoints that allow remote unshackle clients to use
+    your configured services without needing the service implementations.
+    Remote clients can authenticate, get titles/tracks, and receive session data
+    for downloading. Configure remote clients in unshackle.yaml:
+
+    \b
+    remote_services:
+      - url: "http://your-server:8786"
+        api_key: "your-api-key"
+        name: "my-server"
+
+    \b
+    Available remote endpoints:
+    - GET /api/remote/services - List available services
+    - POST /api/remote/{service}/search - Search for content
+    - POST /api/remote/{service}/titles - Get titles
+    - POST /api/remote/{service}/tracks - Get tracks
+    - POST /api/remote/{service}/chapters - Get chapters
     """
     from pywidevine import serve as pywidevine_serve
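As a quick illustration of the endpoints listed in this help text (a sketch only: the `X-API-Key` header, host, port, and `EXAMPLE` service tag are assumptions, and the real authentication scheme is whatever `serve.users`/`serve.api_secret` configure):

```bash
# List the services a remote unshackle server exposes
curl -H "X-API-Key: your-api-key" http://your-server:8786/api/remote/services

# Ask the remote server for the titles behind a given URL/ID
curl -X POST -H "X-API-Key: your-api-key" -H "Content-Type: application/json" \
     -d '{"title": "https://example.com/watch/123", "profile": "default"}' \
     http://your-server:8786/api/remote/EXAMPLE/titles
```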
unshackle/core/api/remote_handlers.py (new file, 941 lines)

@@ -0,0 +1,941 @@
"""API handlers for remote service functionality."""

import http.cookiejar
import inspect
import logging
import tempfile
from pathlib import Path
from typing import Any, Dict, Optional

import click
import yaml
from aiohttp import web

from unshackle.commands.dl import dl
from unshackle.core.api.handlers import (initialize_proxy_providers, resolve_proxy, serialize_audio_track,
                                         serialize_subtitle_track, serialize_title, serialize_video_track,
                                         validate_service)
from unshackle.core.api.session_serializer import serialize_session
from unshackle.core.config import config
from unshackle.core.credential import Credential
from unshackle.core.search_result import SearchResult
from unshackle.core.services import Services
from unshackle.core.titles import Episode
from unshackle.core.utils.click_types import ContextData
from unshackle.core.utils.collections import merge_dict

log = logging.getLogger("api.remote")

def load_cookies_from_content(cookies_content: Optional[str]) -> Optional[http.cookiejar.MozillaCookieJar]:
    """
    Load cookies from raw cookie file content.

    Args:
        cookies_content: Raw content of a Netscape/Mozilla format cookie file

    Returns:
        MozillaCookieJar object or None
    """
    if not cookies_content:
        return None

    # Write to temporary file
    with tempfile.NamedTemporaryFile(mode="w", suffix=".txt", delete=False) as f:
        f.write(cookies_content)
        temp_path = f.name

    try:
        # Load using standard cookie jar
        cookie_jar = http.cookiejar.MozillaCookieJar(temp_path)
        cookie_jar.load(ignore_discard=True, ignore_expires=True)
        return cookie_jar
    finally:
        # Clean up temp file
        Path(temp_path).unlink(missing_ok=True)


def create_credential_from_dict(cred_data: Optional[Dict[str, str]]) -> Optional[Credential]:
    """
    Create a Credential object from dictionary.

    Args:
        cred_data: Dictionary with 'username' and 'password' keys

    Returns:
        Credential object or None
    """
    if not cred_data or "username" not in cred_data or "password" not in cred_data:
        return None

    return Credential(username=cred_data["username"], password=cred_data["password"])


def get_auth_from_request(data: Dict[str, Any], service_tag: str, profile: Optional[str] = None):
    """
    Get authentication (cookies and credentials) from request data or fallback to server config.

    Args:
        data: Request data
        service_tag: Service tag
        profile: Profile name

    Returns:
        Tuple of (cookies, credential)
    """
    # Try to get from client request first
    cookies_content = data.get("cookies")
    credential_data = data.get("credential")

    if cookies_content:
        cookies = load_cookies_from_content(cookies_content)
    else:
        # Fallback to server-side cookies if not provided by client
        cookies = dl.get_cookie_jar(service_tag, profile)

    if credential_data:
        credential = create_credential_from_dict(credential_data)
    else:
        # Fallback to server-side credentials if not provided by client
        credential = dl.get_credentials(service_tag, profile)

    return cookies, credential

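To show what `get_auth_from_request` consumes, here is a sketch of a request body a remote client could send; the field names match the handler code and the OpenAPI schemas below, while all values are placeholders:

```json
{
  "title": "https://example.com/watch/123",
  "profile": "default",
  "proxy": "ca",
  "no_proxy": false,
  "cookies": "# Netscape HTTP Cookie File\n.example.com\tTRUE\t/\tTRUE\t0\tsession\tabc123",
  "credential": {"username": "user@example.com", "password": "hunter2"}
}
```

If `cookies` or `credential` are omitted, the server falls back to its own cookie jar and credentials via `dl.get_cookie_jar` / `dl.get_credentials`.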
async def remote_list_services(request: web.Request) -> web.Response:
    """
    List all available services on this remote server.
    ---
    summary: List remote services
    description: Get all available services that can be accessed remotely
    responses:
      '200':
        description: List of available services
        content:
          application/json:
            schema:
              type: object
              properties:
                status:
                  type: string
                  example: success
                services:
                  type: array
                  items:
                    type: object
                    properties:
                      tag:
                        type: string
                      aliases:
                        type: array
                        items:
                          type: string
                      geofence:
                        type: array
                        items:
                          type: string
                      help:
                        type: string
      '500':
        description: Server error
    """
    try:
        service_tags = Services.get_tags()
        services_info = []

        for tag in service_tags:
            service_data = {
                "tag": tag,
                "aliases": [],
                "geofence": [],
                "help": None,
            }

            try:
                service_module = Services.load(tag)

                if hasattr(service_module, "ALIASES"):
                    service_data["aliases"] = list(service_module.ALIASES)

                if hasattr(service_module, "GEOFENCE"):
                    service_data["geofence"] = list(service_module.GEOFENCE)

                if service_module.__doc__:
                    service_data["help"] = service_module.__doc__.strip()

            except Exception as e:
                log.warning(f"Could not load details for service {tag}: {e}")

            services_info.append(service_data)

        return web.json_response({"status": "success", "services": services_info})

    except Exception as e:
        log.exception("Error listing remote services")
        return web.json_response({"status": "error", "message": str(e)}, status=500)

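A successful response from this endpoint has the shape below; the keys are exactly those built in `services_info`, and the `EXAMPLE` entry is illustrative:

```json
{
  "status": "success",
  "services": [
    {"tag": "EXAMPLE", "aliases": ["EX"], "geofence": ["us"], "help": "Example service docstring"}
  ]
}
```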
async def remote_search(request: web.Request) -> web.Response:
    """
    Search for content on a remote service.
    ---
    summary: Search remote service
    description: Search for content using a remote service
    parameters:
      - name: service
        in: path
        required: true
        schema:
          type: string
    requestBody:
      required: true
      content:
        application/json:
          schema:
            type: object
            required:
              - query
            properties:
              query:
                type: string
                description: Search query
              profile:
                type: string
                description: Profile to use for credentials
    responses:
      '200':
        description: Search results
      '400':
        description: Invalid request
      '500':
        description: Server error
    """
    service_tag = request.match_info.get("service")

    try:
        data = await request.json()
    except Exception:
        return web.json_response({"status": "error", "message": "Invalid JSON request body"}, status=400)

    query = data.get("query")
    if not query:
        return web.json_response({"status": "error", "message": "Missing required parameter: query"}, status=400)

    normalized_service = validate_service(service_tag)
    if not normalized_service:
        return web.json_response(
            {"status": "error", "message": f"Invalid or unavailable service: {service_tag}"}, status=400
        )

    try:
        profile = data.get("profile")

        service_config_path = Services.get_path(normalized_service) / config.filenames.config
        if service_config_path.exists():
            service_config = yaml.safe_load(service_config_path.read_text(encoding="utf8"))
        else:
            service_config = {}
        merge_dict(config.services.get(normalized_service), service_config)

        @click.command()
        @click.pass_context
        def dummy_service(ctx: click.Context) -> None:
            pass

        # Handle proxy configuration
        proxy_param = data.get("proxy")
        no_proxy = data.get("no_proxy", False)
        proxy_providers = []

        if not no_proxy:
            proxy_providers = initialize_proxy_providers()

        if proxy_param and not no_proxy:
            try:
                resolved_proxy = resolve_proxy(proxy_param, proxy_providers)
                proxy_param = resolved_proxy
            except ValueError as e:
                return web.json_response({"status": "error", "message": f"Proxy error: {e}"}, status=400)

        ctx = click.Context(dummy_service)
        ctx.obj = ContextData(config=service_config, cdm=None, proxy_providers=proxy_providers, profile=profile)
        ctx.params = {"proxy": proxy_param, "no_proxy": no_proxy}

        service_module = Services.load(normalized_service)

        dummy_service.name = normalized_service
        ctx.invoked_subcommand = normalized_service

        service_ctx = click.Context(dummy_service, parent=ctx)
        service_ctx.obj = ctx.obj

        # Get service initialization parameters
        service_init_params = inspect.signature(service_module.__init__).parameters
        service_kwargs = {}

        # Extract defaults from click command
        if hasattr(service_module, "cli") and hasattr(service_module.cli, "params"):
            for param in service_module.cli.params:
                if hasattr(param, "name") and param.name not in service_kwargs:
                    if hasattr(param, "default") and param.default is not None:
                        service_kwargs[param.name] = param.default

        # Add query parameter
        if "query" in service_init_params:
            service_kwargs["query"] = query

        # Filter to only valid parameters
        filtered_kwargs = {k: v for k, v in service_kwargs.items() if k in service_init_params}

        service_instance = service_module(service_ctx, **filtered_kwargs)

        # Authenticate with client-provided or server-side auth
        cookies, credential = get_auth_from_request(data, normalized_service, profile)
        service_instance.authenticate(cookies, credential)

        # Perform search
        search_results = []
        if hasattr(service_instance, "search"):
            for result in service_instance.search():
                if isinstance(result, SearchResult):
                    search_results.append(
                        {
                            "id": str(result.id_),
                            "title": result.title,
                            "description": result.description,
                            "label": result.label,
                            "url": result.url,
                        }
                    )

        # Serialize session data
        session_data = serialize_session(service_instance.session)

        return web.json_response({"status": "success", "results": search_results, "session": session_data})

    except Exception as e:
        log.exception("Error performing remote search")
        return web.json_response({"status": "error", "message": str(e)}, status=500)

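The search handler returns the serialized results plus the session it used, roughly as follows (values illustrative; the exact headers/cookies depend on the service):

```json
{
  "status": "success",
  "results": [
    {"id": "12345", "title": "Example Show", "description": "…", "label": null, "url": "https://example.com/title/12345"}
  ],
  "session": {
    "cookies": {"session": {"value": "abc123", "domain": ".example.com", "path": "/", "secure": true, "expires": null}},
    "headers": {"User-Agent": "Mozilla/5.0"},
    "proxies": {}
  }
}
```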
async def remote_get_titles(request: web.Request) -> web.Response:
    """
    Get titles from a remote service.
    ---
    summary: Get titles from remote service
    description: Get available titles for content from a remote service
    parameters:
      - name: service
        in: path
        required: true
        schema:
          type: string
    requestBody:
      required: true
      content:
        application/json:
          schema:
            type: object
            required:
              - title
            properties:
              title:
                type: string
                description: Title identifier, URL, or any format accepted by the service
              profile:
                type: string
                description: Profile to use for credentials
              proxy:
                type: string
                description: Proxy region code (e.g., "ca", "us") or full proxy URL - uses server's proxy configuration
              no_proxy:
                type: boolean
                description: Disable proxy usage
              cookies:
                type: string
                description: Raw Netscape/Mozilla format cookie file content (optional - uses server cookies if not provided)
              credential:
                type: object
                description: Credentials object with username and password (optional - uses server credentials if not provided)
                properties:
                  username:
                    type: string
                  password:
                    type: string
    responses:
      '200':
        description: Titles and session data
      '400':
        description: Invalid request
      '500':
        description: Server error
    """
    service_tag = request.match_info.get("service")

    try:
        data = await request.json()
    except Exception:
        return web.json_response({"status": "error", "message": "Invalid JSON request body"}, status=400)

    # Accept 'title', 'title_id', or 'url' for flexibility
    title = data.get("title") or data.get("title_id") or data.get("url")
    if not title:
        return web.json_response(
            {
                "status": "error",
                "message": "Missing required parameter: title (can be URL, ID, or any format accepted by the service)",
            },
            status=400,
        )

    normalized_service = validate_service(service_tag)
    if not normalized_service:
        return web.json_response(
            {"status": "error", "message": f"Invalid or unavailable service: {service_tag}"}, status=400
        )

    try:
        profile = data.get("profile")

        service_config_path = Services.get_path(normalized_service) / config.filenames.config
        if service_config_path.exists():
            service_config = yaml.safe_load(service_config_path.read_text(encoding="utf8"))
        else:
            service_config = {}
        merge_dict(config.services.get(normalized_service), service_config)

        @click.command()
        @click.pass_context
        def dummy_service(ctx: click.Context) -> None:
            pass

        # Handle proxy configuration
        proxy_param = data.get("proxy")
        no_proxy = data.get("no_proxy", False)
        proxy_providers = []

        if not no_proxy:
            proxy_providers = initialize_proxy_providers()

        if proxy_param and not no_proxy:
            try:
                resolved_proxy = resolve_proxy(proxy_param, proxy_providers)
                proxy_param = resolved_proxy
            except ValueError as e:
                return web.json_response({"status": "error", "message": f"Proxy error: {e}"}, status=400)

        ctx = click.Context(dummy_service)
        ctx.obj = ContextData(config=service_config, cdm=None, proxy_providers=proxy_providers, profile=profile)
        ctx.params = {"proxy": proxy_param, "no_proxy": no_proxy}

        service_module = Services.load(normalized_service)

        dummy_service.name = normalized_service
        dummy_service.params = [click.Argument([title], type=str)]
        ctx.invoked_subcommand = normalized_service

        service_ctx = click.Context(dummy_service, parent=ctx)
        service_ctx.obj = ctx.obj

        service_kwargs = {"title": title}

        # Add additional parameters from request data
        for key, value in data.items():
            if key not in ["title", "title_id", "url", "profile", "proxy", "no_proxy"]:
                service_kwargs[key] = value

        # Get service parameter info and click command defaults
        service_init_params = inspect.signature(service_module.__init__).parameters

        # Extract default values from the click command
        if hasattr(service_module, "cli") and hasattr(service_module.cli, "params"):
            for param in service_module.cli.params:
                if hasattr(param, "name") and param.name not in service_kwargs:
                    if hasattr(param, "default") and param.default is not None:
                        service_kwargs[param.name] = param.default

        # Handle required parameters
        for param_name, param_info in service_init_params.items():
            if param_name not in service_kwargs and param_name not in ["self", "ctx"]:
                if param_info.default is inspect.Parameter.empty:
                    if param_name == "meta_lang":
                        service_kwargs[param_name] = None
                    elif param_name == "movie":
                        service_kwargs[param_name] = False

        # Filter to only valid parameters
        filtered_kwargs = {k: v for k, v in service_kwargs.items() if k in service_init_params}

        service_instance = service_module(service_ctx, **filtered_kwargs)

        # Authenticate with client-provided or server-side auth
        cookies, credential = get_auth_from_request(data, normalized_service, profile)
        service_instance.authenticate(cookies, credential)

        # Get titles
        titles = service_instance.get_titles()

        if hasattr(titles, "__iter__") and not isinstance(titles, str):
            title_list = [serialize_title(t) for t in titles]
        else:
            title_list = [serialize_title(titles)]

        # Serialize session data
        session_data = serialize_session(service_instance.session)

        return web.json_response({"status": "success", "titles": title_list, "session": session_data})

    except Exception as e:
        log.exception("Error getting remote titles")
        return web.json_response({"status": "error", "message": str(e)}, status=500)

async def remote_get_tracks(request: web.Request) -> web.Response:
    """
    Get tracks from a remote service.
    ---
    summary: Get tracks from remote service
    description: Get available tracks for a title from a remote service
    parameters:
      - name: service
        in: path
        required: true
        schema:
          type: string
    requestBody:
      required: true
      content:
        application/json:
          schema:
            type: object
            required:
              - title
            properties:
              title:
                type: string
                description: Title identifier, URL, or any format accepted by the service
              wanted:
                type: string
                description: Specific episodes/seasons
              profile:
                type: string
                description: Profile to use for credentials
              proxy:
                type: string
                description: Proxy region code (e.g., "ca", "us") or full proxy URL - uses server's proxy configuration
              no_proxy:
                type: boolean
                description: Disable proxy usage
              cookies:
                type: string
                description: Raw Netscape/Mozilla format cookie file content (optional - uses server cookies if not provided)
              credential:
                type: object
                description: Credentials object with username and password (optional - uses server credentials if not provided)
                properties:
                  username:
                    type: string
                  password:
                    type: string
    responses:
      '200':
        description: Tracks and session data
      '400':
        description: Invalid request
      '500':
        description: Server error
    """
    service_tag = request.match_info.get("service")

    try:
        data = await request.json()
    except Exception:
        return web.json_response({"status": "error", "message": "Invalid JSON request body"}, status=400)

    # Accept 'title', 'title_id', or 'url' for flexibility
    title = data.get("title") or data.get("title_id") or data.get("url")
    if not title:
        return web.json_response(
            {
                "status": "error",
                "message": "Missing required parameter: title (can be URL, ID, or any format accepted by the service)",
            },
            status=400,
        )

    normalized_service = validate_service(service_tag)
    if not normalized_service:
        return web.json_response(
            {"status": "error", "message": f"Invalid or unavailable service: {service_tag}"}, status=400
        )

    try:
        profile = data.get("profile")

        service_config_path = Services.get_path(normalized_service) / config.filenames.config
        if service_config_path.exists():
            service_config = yaml.safe_load(service_config_path.read_text(encoding="utf8"))
        else:
            service_config = {}
        merge_dict(config.services.get(normalized_service), service_config)

        @click.command()
        @click.pass_context
        def dummy_service(ctx: click.Context) -> None:
            pass

        # Handle proxy configuration
        proxy_param = data.get("proxy")
        no_proxy = data.get("no_proxy", False)
        proxy_providers = []

        if not no_proxy:
            proxy_providers = initialize_proxy_providers()

        if proxy_param and not no_proxy:
            try:
                resolved_proxy = resolve_proxy(proxy_param, proxy_providers)
                proxy_param = resolved_proxy
            except ValueError as e:
                return web.json_response({"status": "error", "message": f"Proxy error: {e}"}, status=400)

        ctx = click.Context(dummy_service)
        ctx.obj = ContextData(config=service_config, cdm=None, proxy_providers=proxy_providers, profile=profile)
        ctx.params = {"proxy": proxy_param, "no_proxy": no_proxy}

        service_module = Services.load(normalized_service)

        dummy_service.name = normalized_service
        dummy_service.params = [click.Argument([title], type=str)]
        ctx.invoked_subcommand = normalized_service

        service_ctx = click.Context(dummy_service, parent=ctx)
        service_ctx.obj = ctx.obj

        service_kwargs = {"title": title}

        # Add additional parameters
        for key, value in data.items():
            if key not in ["title", "title_id", "url", "profile", "wanted", "season", "episode", "proxy", "no_proxy"]:
                service_kwargs[key] = value

        # Get service parameters
        service_init_params = inspect.signature(service_module.__init__).parameters

        # Extract defaults from click command
        if hasattr(service_module, "cli") and hasattr(service_module.cli, "params"):
            for param in service_module.cli.params:
                if hasattr(param, "name") and param.name not in service_kwargs:
                    if hasattr(param, "default") and param.default is not None:
                        service_kwargs[param.name] = param.default

        # Handle required parameters
        for param_name, param_info in service_init_params.items():
            if param_name not in service_kwargs and param_name not in ["self", "ctx"]:
                if param_info.default is inspect.Parameter.empty:
                    if param_name == "meta_lang":
                        service_kwargs[param_name] = None
                    elif param_name == "movie":
                        service_kwargs[param_name] = False

        # Filter to valid parameters
        filtered_kwargs = {k: v for k, v in service_kwargs.items() if k in service_init_params}

        service_instance = service_module(service_ctx, **filtered_kwargs)

        # Authenticate with client-provided or server-side auth
        cookies, credential = get_auth_from_request(data, normalized_service, profile)
        service_instance.authenticate(cookies, credential)

        # Get titles
        titles = service_instance.get_titles()

        wanted_param = data.get("wanted")
        season = data.get("season")
        episode = data.get("episode")

        if hasattr(titles, "__iter__") and not isinstance(titles, str):
            titles_list = list(titles)

            wanted = None
            if wanted_param:
                from unshackle.core.utils.click_types import SeasonRange

                try:
                    season_range = SeasonRange()
                    wanted = season_range.parse_tokens(wanted_param)
                except Exception as e:
                    return web.json_response(
                        {"status": "error", "message": f"Invalid wanted parameter: {e}"}, status=400
                    )
            elif season is not None and episode is not None:
                wanted = [f"{season}x{episode}"]

            if wanted:
                matching_titles = []
                for title in titles_list:
                    if isinstance(title, Episode):
                        episode_key = f"{title.season}x{title.number}"
                        if episode_key in wanted:
                            matching_titles.append(title)
                    else:
                        matching_titles.append(title)

                if not matching_titles:
                    return web.json_response(
                        {"status": "error", "message": "No episodes found matching wanted criteria"}, status=404
                    )

                # Handle multiple episodes
                if len(matching_titles) > 1 and all(isinstance(t, Episode) for t in matching_titles):
                    episodes_data = []
                    failed_episodes = []

                    sorted_titles = sorted(matching_titles, key=lambda t: (t.season, t.number))

                    for title in sorted_titles:
                        try:
                            tracks = service_instance.get_tracks(title)
                            video_tracks = sorted(tracks.videos, key=lambda t: t.bitrate or 0, reverse=True)
                            audio_tracks = sorted(tracks.audio, key=lambda t: t.bitrate or 0, reverse=True)

                            episode_data = {
                                "title": serialize_title(title),
                                "video": [serialize_video_track(t) for t in video_tracks],
                                "audio": [serialize_audio_track(t) for t in audio_tracks],
                                "subtitles": [serialize_subtitle_track(t) for t in tracks.subtitles],
                            }
                            episodes_data.append(episode_data)
                        except (SystemExit, Exception):
                            failed_episodes.append(f"S{title.season}E{title.number:02d}")
                            continue

                    if episodes_data:
                        session_data = serialize_session(service_instance.session)
                        response = {"status": "success", "episodes": episodes_data, "session": session_data}
                        if failed_episodes:
                            response["unavailable_episodes"] = failed_episodes
                        return web.json_response(response)
                    else:
                        return web.json_response(
                            {
                                "status": "error",
                                "message": f"No available episodes. Unavailable: {', '.join(failed_episodes)}",
                            },
                            status=404,
                        )
                else:
                    first_title = matching_titles[0]
            else:
                first_title = titles_list[0]
        else:
            first_title = titles

        # Get tracks for single title
        tracks = service_instance.get_tracks(first_title)

        video_tracks = sorted(tracks.videos, key=lambda t: t.bitrate or 0, reverse=True)
        audio_tracks = sorted(tracks.audio, key=lambda t: t.bitrate or 0, reverse=True)

        # Serialize session data
        session_data = serialize_session(service_instance.session)

        response_data = {
            "status": "success",
            "title": serialize_title(first_title),
            "video": [serialize_video_track(t) for t in video_tracks],
            "audio": [serialize_audio_track(t) for t in audio_tracks],
            "subtitles": [serialize_subtitle_track(t) for t in tracks.subtitles],
            "session": session_data,
        }

        return web.json_response(response_data)

    except Exception as e:
        log.exception("Error getting remote tracks")
        return web.json_response({"status": "error", "message": str(e)}, status=500)

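Episode selection in this handler can be driven either by a `wanted` string (parsed by `SeasonRange.parse_tokens`) or by explicit `season`/`episode` numbers, both of which end up as the same `SxE`-style keys. A request sketch, with placeholder values and an assumed `wanted` token format (the exact syntax is defined by `SeasonRange`):

```json
{
  "title": "https://example.com/series/456",
  "wanted": "1x1-1x3",
  "profile": "default"
}
```

When more than one episode matches, the response carries an `episodes` array (one entry per episode with `title`, `video`, `audio`, `subtitles`) plus an `unavailable_episodes` list for any that failed; a single match falls through to the single-title response shape at the end of the handler.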
async def remote_get_chapters(request: web.Request) -> web.Response:
    """
    Get chapters from a remote service.
    ---
    summary: Get chapters from remote service
    description: Get available chapters for a title from a remote service
    parameters:
      - name: service
        in: path
        required: true
        schema:
          type: string
    requestBody:
      required: true
      content:
        application/json:
          schema:
            type: object
            required:
              - title
            properties:
              title:
                type: string
                description: Title identifier, URL, or any format accepted by the service
              profile:
                type: string
                description: Profile to use for credentials
              proxy:
                type: string
                description: Proxy region code (e.g., "ca", "us") or full proxy URL - uses server's proxy configuration
              no_proxy:
                type: boolean
                description: Disable proxy usage
              cookies:
                type: string
                description: Raw Netscape/Mozilla format cookie file content (optional - uses server cookies if not provided)
              credential:
                type: object
                description: Credentials object with username and password (optional - uses server credentials if not provided)
                properties:
                  username:
                    type: string
                  password:
                    type: string
    responses:
      '200':
        description: Chapters and session data
      '400':
        description: Invalid request
      '500':
        description: Server error
    """
    service_tag = request.match_info.get("service")

    try:
        data = await request.json()
    except Exception:
        return web.json_response({"status": "error", "message": "Invalid JSON request body"}, status=400)

    # Accept 'title', 'title_id', or 'url' for flexibility
    title = data.get("title") or data.get("title_id") or data.get("url")
    if not title:
        return web.json_response(
            {
                "status": "error",
                "message": "Missing required parameter: title (can be URL, ID, or any format accepted by the service)",
            },
            status=400,
        )

    normalized_service = validate_service(service_tag)
    if not normalized_service:
        return web.json_response(
            {"status": "error", "message": f"Invalid or unavailable service: {service_tag}"}, status=400
        )

    try:
        profile = data.get("profile")

        service_config_path = Services.get_path(normalized_service) / config.filenames.config
        if service_config_path.exists():
            service_config = yaml.safe_load(service_config_path.read_text(encoding="utf8"))
        else:
            service_config = {}
        merge_dict(config.services.get(normalized_service), service_config)

        @click.command()
        @click.pass_context
        def dummy_service(ctx: click.Context) -> None:
            pass

        # Handle proxy configuration
        proxy_param = data.get("proxy")
        no_proxy = data.get("no_proxy", False)
        proxy_providers = []

        if not no_proxy:
            proxy_providers = initialize_proxy_providers()

        if proxy_param and not no_proxy:
            try:
                resolved_proxy = resolve_proxy(proxy_param, proxy_providers)
                proxy_param = resolved_proxy
            except ValueError as e:
                return web.json_response({"status": "error", "message": f"Proxy error: {e}"}, status=400)

        ctx = click.Context(dummy_service)
        ctx.obj = ContextData(config=service_config, cdm=None, proxy_providers=proxy_providers, profile=profile)
        ctx.params = {"proxy": proxy_param, "no_proxy": no_proxy}

        service_module = Services.load(normalized_service)

        dummy_service.name = normalized_service
        dummy_service.params = [click.Argument([title], type=str)]
        ctx.invoked_subcommand = normalized_service

        service_ctx = click.Context(dummy_service, parent=ctx)
        service_ctx.obj = ctx.obj

        service_kwargs = {"title": title}

        # Add additional parameters
        for key, value in data.items():
            if key not in ["title", "title_id", "url", "profile", "proxy", "no_proxy"]:
                service_kwargs[key] = value

        # Get service parameters
        service_init_params = inspect.signature(service_module.__init__).parameters

        # Extract defaults
        if hasattr(service_module, "cli") and hasattr(service_module.cli, "params"):
            for param in service_module.cli.params:
                if hasattr(param, "name") and param.name not in service_kwargs:
                    if hasattr(param, "default") and param.default is not None:
                        service_kwargs[param.name] = param.default

        # Handle required parameters
        for param_name, param_info in service_init_params.items():
            if param_name not in service_kwargs and param_name not in ["self", "ctx"]:
                if param_info.default is inspect.Parameter.empty:
                    if param_name == "meta_lang":
                        service_kwargs[param_name] = None
                    elif param_name == "movie":
                        service_kwargs[param_name] = False

        # Filter to valid parameters
        filtered_kwargs = {k: v for k, v in service_kwargs.items() if k in service_init_params}

        service_instance = service_module(service_ctx, **filtered_kwargs)

        # Authenticate with client-provided or server-side auth
        cookies, credential = get_auth_from_request(data, normalized_service, profile)
        service_instance.authenticate(cookies, credential)

        # Get titles
        titles = service_instance.get_titles()

        if hasattr(titles, "__iter__") and not isinstance(titles, str):
            first_title = list(titles)[0]
        else:
            first_title = titles

        # Get chapters if service supports it
        chapters_data = []
        if hasattr(service_instance, "get_chapters"):
            chapters = service_instance.get_chapters(first_title)
            if chapters:
                for chapter in chapters:
                    chapters_data.append(
                        {
                            "timestamp": chapter.timestamp,
                            "name": chapter.name if hasattr(chapter, "name") else None,
                        }
                    )

        # Serialize session data
        session_data = serialize_session(service_instance.session)

        return web.json_response({"status": "success", "chapters": chapters_data, "session": session_data})

    except Exception as e:
        log.exception("Error getting remote chapters")
        return web.json_response({"status": "error", "message": str(e)}, status=500)
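Chapters come back as a flat list of `timestamp`/`name` pairs alongside the session. A sketch (values illustrative; the timestamp format depends on the Chapter class):

```json
{
  "status": "success",
  "chapters": [
    {"timestamp": "00:00:00.000", "name": "Intro"},
    {"timestamp": "00:05:12.000", "name": null}
  ],
  "session": {"cookies": {}, "headers": {}, "proxies": {}}
}
```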
@@ -6,6 +6,8 @@ from aiohttp_swagger3 import SwaggerDocs, SwaggerInfo, SwaggerUiSettings
 from unshackle.core import __version__
 from unshackle.core.api.handlers import (cancel_download_job_handler, download_handler, get_download_job_handler,
                                          list_download_jobs_handler, list_titles_handler, list_tracks_handler)
+from unshackle.core.api.remote_handlers import (remote_get_chapters, remote_get_titles, remote_get_tracks,
+                                                remote_list_services, remote_search)
 from unshackle.core.services import Services
 from unshackle.core.update_checker import UpdateChecker

@@ -360,6 +362,13 @@ def setup_routes(app: web.Application) -> None:
     app.router.add_get("/api/download/jobs/{job_id}", download_job_detail)
     app.router.add_delete("/api/download/jobs/{job_id}", cancel_download_job)
+
+    # Remote service endpoints
+    app.router.add_get("/api/remote/services", remote_list_services)
+    app.router.add_post("/api/remote/{service}/search", remote_search)
+    app.router.add_post("/api/remote/{service}/titles", remote_get_titles)
+    app.router.add_post("/api/remote/{service}/tracks", remote_get_tracks)
+    app.router.add_post("/api/remote/{service}/chapters", remote_get_chapters)


 def setup_swagger(app: web.Application) -> None:
     """Setup Swagger UI documentation."""
@@ -384,5 +393,11 @@ def setup_swagger(app: web.Application) -> None:
             web.get("/api/download/jobs", download_jobs),
             web.get("/api/download/jobs/{job_id}", download_job_detail),
             web.delete("/api/download/jobs/{job_id}", cancel_download_job),
+            # Remote service routes
+            web.get("/api/remote/services", remote_list_services),
+            web.post("/api/remote/{service}/search", remote_search),
+            web.post("/api/remote/{service}/titles", remote_get_titles),
+            web.post("/api/remote/{service}/tracks", remote_get_tracks),
+            web.post("/api/remote/{service}/chapters", remote_get_chapters),
         ]
     )
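Putting the new routes together, a minimal remote-client sketch (not part of the diff; the `X-API-Key` header, base URL, and use of `requests` here are assumptions for illustration):

```python
import requests

BASE = "http://your-server:8786"          # remote unshackle server
HEADERS = {"X-API-Key": "your-api-key"}   # assumed auth header; match your serve config

# Discover the services the server exposes, then fetch tracks for a title on one of them
services = requests.get(f"{BASE}/api/remote/services", headers=HEADERS).json()["services"]
service_tag = services[0]["tag"]

tracks = requests.post(
    f"{BASE}/api/remote/{service_tag}/tracks",
    headers=HEADERS,
    json={"title": "https://example.com/watch/123"},
).json()
print(tracks["status"], len(tracks.get("video", [])), "video tracks")
```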
unshackle/core/api/session_serializer.py (new file, 236 lines)

@@ -0,0 +1,236 @@
"""Session serialization helpers for remote services."""

from http.cookiejar import CookieJar
from typing import Any, Dict, Optional

import requests

from unshackle.core.credential import Credential


def serialize_session(session: requests.Session) -> Dict[str, Any]:
    """
    Serialize a requests.Session into a JSON-serializable dictionary.

    Extracts cookies, headers, and other session data that can be
    transferred to a remote client for downloading.

    Args:
        session: The requests.Session to serialize

    Returns:
        Dictionary containing serialized session data
    """
    session_data = {
        "cookies": {},
        "headers": {},
        "proxies": session.proxies.copy() if session.proxies else {},
    }

    # Serialize cookies
    if session.cookies:
        for cookie in session.cookies:
            session_data["cookies"][cookie.name] = {
                "value": cookie.value,
                "domain": cookie.domain,
                "path": cookie.path,
                "secure": cookie.secure,
                "expires": cookie.expires,
            }

    # Serialize headers (exclude proxy-authorization for security)
    if session.headers:
        for key, value in session.headers.items():
            # Skip proxy-related headers as they're server-specific
            if key.lower() not in ["proxy-authorization"]:
                session_data["headers"][key] = value

    return session_data


def deserialize_session(
    session_data: Dict[str, Any], target_session: Optional[requests.Session] = None
) -> requests.Session:
    """
    Deserialize session data into a requests.Session.

    Applies cookies, headers, and other session data from a remote server
    to a local session for downloading.

    Args:
        session_data: Dictionary containing serialized session data
        target_session: Optional existing session to update (creates new if None)

    Returns:
        requests.Session with applied session data
    """
    if target_session is None:
        target_session = requests.Session()

    # Apply cookies
    if "cookies" in session_data:
        for cookie_name, cookie_data in session_data["cookies"].items():
            target_session.cookies.set(
                name=cookie_name,
                value=cookie_data["value"],
                domain=cookie_data.get("domain"),
                path=cookie_data.get("path", "/"),
                secure=cookie_data.get("secure", False),
                expires=cookie_data.get("expires"),
            )

    # Apply headers
    if "headers" in session_data:
        target_session.headers.update(session_data["headers"])

    # Note: We don't apply proxies from remote as the local client
    # should use its own proxy configuration

    return target_session

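A round trip with the two helpers above, as a server-to-client hand-off sketch (the real transport is the API response; `json` only mimics it here):

```python
import json

import requests

from unshackle.core.api.session_serializer import deserialize_session, serialize_session

# Server side: capture the authenticated service session as JSON-safe data
server_session = requests.Session()
server_session.headers["Authorization"] = "Bearer example-token"
payload = json.dumps(serialize_session(server_session))

# Client side: rebuild an equivalent session locally and download with it
client_session = deserialize_session(json.loads(payload))
assert client_session.headers["Authorization"] == "Bearer example-token"
```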
def extract_session_tokens(session: requests.Session) -> Dict[str, Any]:
|
||||||
|
"""
|
||||||
|
Extract authentication tokens and similar data from a session.
|
||||||
|
|
||||||
|
Looks for common authentication patterns like Bearer tokens,
|
||||||
|
API keys in headers, etc.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
session: The requests.Session to extract tokens from
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Dictionary containing extracted tokens
|
||||||
|
"""
|
||||||
|
tokens = {}
|
||||||
|
|
||||||
|
# Check for Authorization header
|
||||||
|
if "Authorization" in session.headers:
|
||||||
|
tokens["authorization"] = session.headers["Authorization"]
|
||||||
|
|
||||||
|
# Check for common API key headers
|
||||||
|
for key in ["X-API-Key", "Api-Key", "X-Auth-Token"]:
|
||||||
|
if key in session.headers:
        tokens[key.lower().replace("-", "_")] = session.headers[key]

    return tokens


def apply_session_tokens(tokens: Dict[str, Any], target_session: requests.Session) -> None:
    """
    Apply authentication tokens to a session.

    Args:
        tokens: Dictionary containing tokens to apply
        target_session: Session to apply tokens to
    """
    # Apply Authorization header
    if "authorization" in tokens:
        target_session.headers["Authorization"] = tokens["authorization"]

    # Apply other token headers
    token_header_map = {
        "x_api_key": "X-API-Key",
        "api_key": "Api-Key",
        "x_auth_token": "X-Auth-Token",
    }

    for token_key, header_name in token_header_map.items():
        if token_key in tokens:
            target_session.headers[header_name] = tokens[token_key]


def serialize_cookies(cookie_jar: Optional[CookieJar]) -> Dict[str, Any]:
    """
    Serialize a CookieJar into a JSON-serializable dictionary.

    Args:
        cookie_jar: The CookieJar to serialize

    Returns:
        Dictionary containing serialized cookies
    """
    if not cookie_jar:
        return {}

    cookies = {}
    for cookie in cookie_jar:
        cookies[cookie.name] = {
            "value": cookie.value,
            "domain": cookie.domain,
            "path": cookie.path,
            "secure": cookie.secure,
            "expires": cookie.expires,
        }

    return cookies


def deserialize_cookies(cookies_data: Dict[str, Any]) -> CookieJar:
    """
    Deserialize cookies into a CookieJar.

    Args:
        cookies_data: Dictionary containing serialized cookies

    Returns:
        CookieJar with cookies
    """
    import http.cookiejar

    cookie_jar = http.cookiejar.CookieJar()

    for cookie_name, cookie_data in cookies_data.items():
        cookie = http.cookiejar.Cookie(
            version=0,
            name=cookie_name,
            value=cookie_data["value"],
            port=None,
            port_specified=False,
            domain=cookie_data.get("domain", ""),
            domain_specified=bool(cookie_data.get("domain")),
            domain_initial_dot=cookie_data.get("domain", "").startswith("."),
            path=cookie_data.get("path", "/"),
            path_specified=True,
            secure=cookie_data.get("secure", False),
            expires=cookie_data.get("expires"),
            discard=False,
            comment=None,
            comment_url=None,
            rest={},
        )
        cookie_jar.set_cookie(cookie)

    return cookie_jar


def serialize_credential(credential: Optional[Credential]) -> Optional[Dict[str, str]]:
    """
    Serialize a Credential into a JSON-serializable dictionary.

    Args:
        credential: The Credential to serialize

    Returns:
        Dictionary containing username and password, or None
    """
    if not credential:
        return None

    return {"username": credential.username, "password": credential.password}


def deserialize_credential(credential_data: Optional[Dict[str, str]]) -> Optional[Credential]:
    """
    Deserialize credential data into a Credential object.

    Args:
        credential_data: Dictionary containing username and password

    Returns:
        Credential object or None
    """
    if not credential_data:
        return None

    return Credential(username=credential_data["username"], password=credential_data["password"])
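
These helpers are how session state is shipped between a remote unshackle server and the local client. A minimal round-trip sketch (not part of the diff; the module path is assumed from the `deserialize_session` import in `remote_service.py` further down):

```python
# Hedged usage sketch: serialize cookies from one session and rebuild them on
# another, then apply token headers. Import path is an assumption.
import requests

from unshackle.core.api.session_serializer import (
    apply_session_tokens,
    deserialize_cookies,
    serialize_cookies,
)

source = requests.Session()
source.cookies.set("sessionid", "abc123", domain=".example.com", path="/")

# Cookies become a plain dict that can travel over JSON...
payload = serialize_cookies(source.cookies)

# ...and are rebuilt into a CookieJar on the receiving side.
target = requests.Session()
for cookie in deserialize_cookies(payload):
    target.cookies.set_cookie(cookie)

# Token headers use the mapping above (e.g. "x_api_key" -> "X-API-Key").
apply_session_tokens({"authorization": "Bearer <token>", "x_api_key": "<key>"}, target)
```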
@@ -103,6 +103,8 @@ class Config:
         self.debug: bool = kwargs.get("debug", False)
         self.debug_keys: bool = kwargs.get("debug_keys", False)
 
+        self.remote_services: list[dict] = kwargs.get("remote_services") or []
+
     @classmethod
     def from_yaml(cls, path: Path) -> Config:
         if not path.exists():
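
The only consumer of the new `remote_services` option in this change set is the discovery code further down, so its expected shape can be sketched (keys inferred from `_discover_from_server`; values are placeholders):

```python
# Hedged sketch of what Config.remote_services is expected to hold after
# Config.from_yaml() has loaded the user's config. Only "url", "api_key" and
# the optional "name" keys are read by the discovery code.
remote_services = [
    {
        "name": "home-server",                   # friendly label, defaults to the URL
        "url": "https://unshackle.example.com",  # base URL of the remote unshackle server
        "api_key": "REPLACE_ME",                 # key checked by the remote REST API
    },
]
```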
427 unshackle/core/remote_service.py Normal file
@@ -0,0 +1,427 @@
"""Remote service implementation for connecting to remote unshackle servers."""

import logging
from collections.abc import Generator
from http.cookiejar import CookieJar
from typing import Any, Dict, Optional, Union

import click
import requests
from rich.padding import Padding
from rich.rule import Rule

from unshackle.core.api.session_serializer import deserialize_session
from unshackle.core.console import console
from unshackle.core.credential import Credential
from unshackle.core.search_result import SearchResult
from unshackle.core.titles import Episode, Movie, Movies, Series
from unshackle.core.tracks import Chapter, Chapters, Tracks
from unshackle.core.tracks.audio import Audio
from unshackle.core.tracks.subtitle import Subtitle
from unshackle.core.tracks.video import Video


class RemoteService:
    """
    Remote Service wrapper that connects to a remote unshackle server.

    This class mimics the Service interface but delegates all operations
    to a remote unshackle server via API calls. It receives session data
    from the remote server which is then used locally for downloading.
    """

    ALIASES: tuple[str, ...] = ()
    GEOFENCE: tuple[str, ...] = ()

    def __init__(
        self,
        ctx: click.Context,
        remote_url: str,
        api_key: str,
        service_tag: str,
        service_metadata: Dict[str, Any],
        **kwargs,
    ):
        """
        Initialize remote service.

        Args:
            ctx: Click context
            remote_url: Base URL of the remote unshackle server
            api_key: API key for authentication
            service_tag: The service tag on the remote server (e.g., "DSNP")
            service_metadata: Metadata about the service from remote discovery
            **kwargs: Additional service-specific parameters
        """
        console.print(Padding(Rule(f"[rule.text]Remote Service: {service_tag}"), (1, 2)))

        self.log = logging.getLogger(f"RemoteService.{service_tag}")
        self.remote_url = remote_url.rstrip("/")
        self.api_key = api_key
        self.service_tag = service_tag
        self.service_metadata = service_metadata
        self.ctx = ctx
        self.kwargs = kwargs

        # Set GEOFENCE and ALIASES from metadata
        if "geofence" in service_metadata:
            self.GEOFENCE = tuple(service_metadata["geofence"])
        if "aliases" in service_metadata:
            self.ALIASES = tuple(service_metadata["aliases"])

        # Create a session for API calls to the remote server
        self.api_session = requests.Session()
        self.api_session.headers.update({"X-API-Key": self.api_key, "Content-Type": "application/json"})

        # This session will receive data from remote for actual downloading
        self.session = requests.Session()

        # Store authentication state
        self.authenticated = False
        self.credential = None
        self.cookies_content = None  # Raw cookie file content to send to remote

    def _make_request(self, endpoint: str, data: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
        """
        Make an API request to the remote server.

        Automatically includes cookies and credentials in the request.

        Args:
            endpoint: API endpoint path (e.g., "/api/remote/DSNP/titles")
            data: Optional JSON data to send

        Returns:
            Response JSON data

        Raises:
            ConnectionError: If the request fails
        """
        url = f"{self.remote_url}{endpoint}"

        # Ensure data is a dictionary
        if data is None:
            data = {}

        # Add cookies and credentials to request if available
        if self.cookies_content:
            data["cookies"] = self.cookies_content

        if self.credential:
            data["credential"] = {"username": self.credential.username, "password": self.credential.password}

        try:
            if data:
                response = self.api_session.post(url, json=data)
            else:
                response = self.api_session.get(url)

            response.raise_for_status()
            result = response.json()

            # Apply session data if present
            if "session" in result:
                deserialize_session(result["session"], self.session)

            return result

        except requests.RequestException as e:
            self.log.error(f"Remote API request failed: {e}")
            raise ConnectionError(f"Failed to communicate with remote server: {e}")

    def authenticate(self, cookies: Optional[CookieJar] = None, credential: Optional[Credential] = None) -> None:
        """
        Prepare authentication data to send to remote service.

        Stores cookies and credentials to send with each API request.
        The remote server will use these for authentication.

        Args:
            cookies: Cookie jar from local configuration
            credential: Credentials from local configuration
        """
        self.log.info("Preparing authentication for remote server...")
        self.credential = credential

        # Read cookies file content if cookies provided
        if cookies and hasattr(cookies, "filename") and cookies.filename:
            try:
                from pathlib import Path

                cookie_file = Path(cookies.filename)
                if cookie_file.exists():
                    self.cookies_content = cookie_file.read_text()
                    self.log.info(f"Loaded cookies from {cookie_file}")
            except Exception as e:
                self.log.warning(f"Could not read cookie file: {e}")

        self.authenticated = True
        self.log.info("Authentication data ready for remote server")

    def search(self, query: Optional[str] = None) -> Generator[SearchResult, None, None]:
        """
        Search for content on the remote service.

        Args:
            query: Search query string

        Yields:
            SearchResult objects
        """
        if query is None:
            query = self.kwargs.get("query", "")

        self.log.info(f"Searching remote service for: {query}")

        data = {"query": query}

        # Add any additional parameters
        if hasattr(self.ctx, "params"):
            if self.ctx.params.get("proxy"):
                data["proxy"] = self.ctx.params["proxy"]
            if self.ctx.params.get("no_proxy"):
                data["no_proxy"] = True

        response = self._make_request(f"/api/remote/{self.service_tag}/search", data)

        if response.get("status") == "success" and "results" in response:
            for result in response["results"]:
                yield SearchResult(
                    id_=result["id"],
                    title=result["title"],
                    description=result.get("description"),
                    label=result.get("label"),
                    url=result.get("url"),
                )

    def get_titles(self) -> Union[Movies, Series]:
        """
        Get titles from the remote service.

        Returns:
            Movies or Series object containing title information
        """
        title = self.kwargs.get("title")

        if not title:
            raise ValueError("No title provided")

        self.log.info(f"Getting titles from remote service for: {title}")

        data = {"title": title}

        # Add additional parameters
        for key, value in self.kwargs.items():
            if key not in ["title"]:
                data[key] = value

        # Add context parameters
        if hasattr(self.ctx, "params"):
            if self.ctx.params.get("proxy"):
                data["proxy"] = self.ctx.params["proxy"]
            if self.ctx.params.get("no_proxy"):
                data["no_proxy"] = True

        response = self._make_request(f"/api/remote/{self.service_tag}/titles", data)

        if response.get("status") != "success" or "titles" not in response:
            raise ValueError(f"Failed to get titles from remote: {response.get('message', 'Unknown error')}")

        titles_data = response["titles"]

        # Deserialize titles
        titles = []
        for title_info in titles_data:
            if title_info["type"] == "movie":
                titles.append(
                    Movie(
                        id_=title_info.get("id", title),
                        service=self.__class__,
                        name=title_info["name"],
                        year=title_info.get("year"),
                        data=title_info,
                    )
                )
            elif title_info["type"] == "episode":
                titles.append(
                    Episode(
                        id_=title_info.get("id", title),
                        service=self.__class__,
                        title=title_info.get("series_title", title_info["name"]),
                        season=title_info.get("season", 0),
                        number=title_info.get("number", 0),
                        name=title_info.get("name"),
                        year=title_info.get("year"),
                        data=title_info,
                    )
                )

        # Return appropriate container
        if titles and isinstance(titles[0], Episode):
            return Series(titles)
        else:
            return Movies(titles)

    def get_tracks(self, title: Union[Movie, Episode]) -> Tracks:
        """
        Get tracks from the remote service.

        Args:
            title: Title object to get tracks for

        Returns:
            Tracks object containing video, audio, and subtitle tracks
        """
        self.log.info(f"Getting tracks from remote service for: {title}")

        title_input = self.kwargs.get("title")
        data = {"title": title_input}

        # Add episode information if applicable
        if isinstance(title, Episode):
            data["season"] = title.season
            data["episode"] = title.number

        # Add additional parameters
        for key, value in self.kwargs.items():
            if key not in ["title"]:
                data[key] = value

        # Add context parameters
        if hasattr(self.ctx, "params"):
            if self.ctx.params.get("proxy"):
                data["proxy"] = self.ctx.params["proxy"]
            if self.ctx.params.get("no_proxy"):
                data["no_proxy"] = True

        response = self._make_request(f"/api/remote/{self.service_tag}/tracks", data)

        if response.get("status") != "success":
            raise ValueError(f"Failed to get tracks from remote: {response.get('message', 'Unknown error')}")

        # Handle multiple episodes response
        if "episodes" in response:
            # For multiple episodes, return tracks for the matching title
            for episode_data in response["episodes"]:
                episode_title = episode_data["title"]
                if (
                    isinstance(title, Episode)
                    and episode_title.get("season") == title.season
                    and episode_title.get("number") == title.number
                ):
                    return self._deserialize_tracks(episode_data, title)

            raise ValueError(f"Could not find tracks for {title.season}x{title.number} in remote response")

        # Single title response
        return self._deserialize_tracks(response, title)

    def _deserialize_tracks(self, data: Dict[str, Any], title: Union[Movie, Episode]) -> Tracks:
        """
        Deserialize tracks from API response.

        Args:
            data: Track data from API
            title: Title object these tracks belong to

        Returns:
            Tracks object
        """
        tracks = Tracks()

        # Deserialize video tracks
        for video_data in data.get("video", []):
            video = Video(
                id_=video_data["id"],
                url="",  # URL will be populated during download from manifests
                codec=Video.Codec[video_data["codec"]],
                bitrate=video_data.get("bitrate", 0) * 1000 if video_data.get("bitrate") else None,
                width=video_data.get("width"),
                height=video_data.get("height"),
                fps=video_data.get("fps"),
                range_=Video.Range[video_data["range"]] if video_data.get("range") else None,
                language=video_data.get("language"),
                drm=video_data.get("drm"),
            )
            tracks.add(video)

        # Deserialize audio tracks
        for audio_data in data.get("audio", []):
            audio = Audio(
                id_=audio_data["id"],
                url="",  # URL will be populated during download
                codec=Audio.Codec[audio_data["codec"]],
                bitrate=audio_data.get("bitrate", 0) * 1000 if audio_data.get("bitrate") else None,
                channels=audio_data.get("channels"),
                language=audio_data.get("language"),
                descriptive=audio_data.get("descriptive", False),
                drm=audio_data.get("drm"),
            )
            if audio_data.get("atmos"):
                audio.atmos = True
            tracks.add(audio)

        # Deserialize subtitle tracks
        for subtitle_data in data.get("subtitles", []):
            subtitle = Subtitle(
                id_=subtitle_data["id"],
                url="",  # URL will be populated during download
                codec=Subtitle.Codec[subtitle_data["codec"]],
                language=subtitle_data.get("language"),
                forced=subtitle_data.get("forced", False),
                sdh=subtitle_data.get("sdh", False),
                cc=subtitle_data.get("cc", False),
            )
            tracks.add(subtitle)

        return tracks

    def get_chapters(self, title: Union[Movie, Episode]) -> Chapters:
        """
        Get chapters from the remote service.

        Args:
            title: Title object to get chapters for

        Returns:
            Chapters object
        """
        self.log.info(f"Getting chapters from remote service for: {title}")

        title_input = self.kwargs.get("title")
        data = {"title": title_input}

        # Add episode information if applicable
        if isinstance(title, Episode):
            data["season"] = title.season
            data["episode"] = title.number

        # Add context parameters
        if hasattr(self.ctx, "params"):
            if self.ctx.params.get("proxy"):
                data["proxy"] = self.ctx.params["proxy"]
            if self.ctx.params.get("no_proxy"):
                data["no_proxy"] = True

        response = self._make_request(f"/api/remote/{self.service_tag}/chapters", data)

        if response.get("status") != "success":
            self.log.warning(f"Failed to get chapters from remote: {response.get('message', 'Unknown error')}")
            return Chapters()

        chapters = Chapters()
        for chapter_data in response.get("chapters", []):
            chapters.add(Chapter(timestamp=chapter_data["timestamp"], name=chapter_data.get("name")))

        return chapters

    @staticmethod
    def get_session() -> requests.Session:
        """
        Create a session for the remote service.

        Returns:
            A requests.Session object
        """
        session = requests.Session()
        return session
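
The endpoints themselves live on the remote server and are not part of this file, so their payloads are only visible through the parsing above. A hedged sketch of a `/titles` response that `get_titles()` would accept (field names taken from the deserialization code; values are illustrative):

```python
# Illustrative only: the JSON shape RemoteService.get_titles() parses after
# POST {remote_url}/api/remote/{tag}/titles.
example_titles_response = {
    "status": "success",
    "session": {},  # optional; passed to deserialize_session() when present
    "titles": [
        {
            "type": "episode",          # "movie" or "episode"
            "id": "remote-episode-id",
            "series_title": "Example Show",
            "name": "Pilot",
            "season": 1,
            "number": 1,
            "year": 2024,
        }
    ],
}
```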
245 unshackle/core/remote_services.py Normal file
@@ -0,0 +1,245 @@
"""Remote service discovery and management."""

import logging
from pathlib import Path
from typing import Any, Dict, List, Optional

import requests

from unshackle.core.config import config
from unshackle.core.remote_service import RemoteService

log = logging.getLogger("RemoteServices")


class RemoteServiceManager:
    """
    Manages discovery and registration of remote services.

    This class connects to configured remote unshackle servers,
    discovers available services, and creates RemoteService instances
    that can be used like local services.
    """

    def __init__(self):
        """Initialize the remote service manager."""
        self.remote_services: Dict[str, type] = {}
        self.remote_configs: List[Dict[str, Any]] = []

    def discover_services(self) -> None:
        """
        Discover services from all configured remote servers.

        Reads the remote_services configuration, connects to each server,
        retrieves available services, and creates RemoteService classes
        for each discovered service.
        """
        if not config.remote_services:
            log.debug("No remote services configured")
            return

        log.info(f"Discovering services from {len(config.remote_services)} remote server(s)...")

        for remote_config in config.remote_services:
            try:
                self._discover_from_server(remote_config)
            except Exception as e:
                log.error(f"Failed to discover services from {remote_config.get('url')}: {e}")
                continue

        log.info(f"Discovered {len(self.remote_services)} remote service(s)")

    def _discover_from_server(self, remote_config: Dict[str, Any]) -> None:
        """
        Discover services from a single remote server.

        Args:
            remote_config: Configuration for the remote server
                (must contain 'url' and 'api_key')
        """
        url = remote_config.get("url", "").rstrip("/")
        api_key = remote_config.get("api_key", "")
        server_name = remote_config.get("name", url)

        if not url:
            log.warning("Remote service configuration missing 'url', skipping")
            return

        if not api_key:
            log.warning(f"Remote service {url} missing 'api_key', skipping")
            return

        log.info(f"Connecting to remote server: {server_name}")

        try:
            # Query the remote server for available services
            response = requests.get(
                f"{url}/api/remote/services",
                headers={"X-API-Key": api_key, "Content-Type": "application/json"},
                timeout=10,
            )

            response.raise_for_status()
            data = response.json()

            if data.get("status") != "success" or "services" not in data:
                log.error(f"Invalid response from {url}: {data}")
                return

            services = data["services"]
            log.info(f"Found {len(services)} service(s) on {server_name}")

            # Create RemoteService classes for each service
            for service_info in services:
                self._register_remote_service(url, api_key, service_info, server_name)

        except requests.RequestException as e:
            log.error(f"Failed to connect to remote server {url}: {e}")
            raise

    def _register_remote_service(
        self, remote_url: str, api_key: str, service_info: Dict[str, Any], server_name: str
    ) -> None:
        """
        Register a remote service as a local service class.

        Args:
            remote_url: Base URL of the remote server
            api_key: API key for authentication
            service_info: Service metadata from the remote server
            server_name: Friendly name of the remote server
        """
        service_tag = service_info.get("tag")
        if not service_tag:
            log.warning(f"Service info missing 'tag': {service_info}")
            return

        # Create a unique tag for the remote service
        # Use "remote_" prefix to distinguish from local services
        remote_tag = f"remote_{service_tag}"

        # Check if this remote service is already registered
        if remote_tag in self.remote_services:
            log.debug(f"Remote service {remote_tag} already registered, skipping")
            return

        log.info(f"Registering remote service: {remote_tag} from {server_name}")

        # Create a dynamic class that inherits from RemoteService
        # This allows us to create instances with the cli() method for Click integration
        class DynamicRemoteService(RemoteService):
            """Dynamically created remote service class."""

            def __init__(self, ctx, **kwargs):
                super().__init__(
                    ctx=ctx,
                    remote_url=remote_url,
                    api_key=api_key,
                    service_tag=service_tag,
                    service_metadata=service_info,
                    **kwargs,
                )

            @staticmethod
            def cli():
                """CLI method for Click integration."""
                import click

                # Create a dynamic Click command for this service
                @click.command(
                    name=remote_tag,
                    short_help=f"Remote: {service_info.get('help', service_tag)}",
                    help=service_info.get("help", f"Remote service for {service_tag}"),
                )
                @click.argument("title", type=str, required=False)
                @click.option("-q", "--query", type=str, help="Search query")
                @click.pass_context
                def remote_service_cli(ctx, title=None, query=None, **kwargs):
                    # Combine title and kwargs
                    params = {**kwargs}
                    if title:
                        params["title"] = title
                    if query:
                        params["query"] = query

                    return DynamicRemoteService(ctx, **params)

                return remote_service_cli

        # Set class name for better debugging
        DynamicRemoteService.__name__ = remote_tag
        DynamicRemoteService.__module__ = "unshackle.remote_services"

        # Set GEOFENCE and ALIASES
        if "geofence" in service_info:
            DynamicRemoteService.GEOFENCE = tuple(service_info["geofence"])
        if "aliases" in service_info:
            # Add "remote_" prefix to aliases too
            DynamicRemoteService.ALIASES = tuple(f"remote_{alias}" for alias in service_info["aliases"])

        # Register the service
        self.remote_services[remote_tag] = DynamicRemoteService

    def get_service(self, tag: str) -> Optional[type]:
        """
        Get a remote service class by tag.

        Args:
            tag: Service tag (e.g., "remote_DSNP")

        Returns:
            RemoteService class or None if not found
        """
        return self.remote_services.get(tag)

    def get_all_services(self) -> Dict[str, type]:
        """
        Get all registered remote services.

        Returns:
            Dictionary mapping service tags to RemoteService classes
        """
        return self.remote_services.copy()

    def get_service_path(self, tag: str) -> Optional[Path]:
        """
        Get the path for a remote service.

        Remote services don't have local paths, so this returns None.
        This method exists for compatibility with the Services interface.

        Args:
            tag: Service tag

        Returns:
            None (remote services have no local path)
        """
        return None


# Global instance
_remote_service_manager: Optional[RemoteServiceManager] = None


def get_remote_service_manager() -> RemoteServiceManager:
    """
    Get the global RemoteServiceManager instance.

    Creates the instance on first call and discovers services.

    Returns:
        RemoteServiceManager instance
    """
    global _remote_service_manager

    if _remote_service_manager is None:
        _remote_service_manager = RemoteServiceManager()
        try:
            _remote_service_manager.discover_services()
        except Exception as e:
            log.error(f"Failed to discover remote services: {e}")

    return _remote_service_manager


__all__ = ("RemoteServiceManager", "get_remote_service_manager")
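
A minimal sketch (assuming `config.remote_services` is populated as shown earlier) of how discovery is expected to surface remote services alongside local ones:

```python
# Hedged usage sketch, not part of the diff.
from unshackle.core.remote_services import get_remote_service_manager

manager = get_remote_service_manager()  # discovers services on first call
for tag, service_cls in manager.get_all_services().items():
    # Tags carry a "remote_" prefix, e.g. "remote_DSNP".
    print(tag, service_cls.GEOFENCE, service_cls.ALIASES)

# The Services registry changes below resolve these tags as well, e.g.
#   Services.load("remote_DSNP") -> DynamicRemoteService subclass
```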
@@ -25,6 +25,17 @@ class Services(click.MultiCommand):
 
     # Click-specific methods
 
+    @staticmethod
+    def _get_remote_services():
+        """Get remote services from the manager (lazy import to avoid circular dependency)."""
+        try:
+            from unshackle.core.remote_services import get_remote_service_manager
+
+            manager = get_remote_service_manager()
+            return manager.get_all_services()
+        except Exception:
+            return {}
+
     def list_commands(self, ctx: click.Context) -> list[str]:
         """Returns a list of all available Services as command names for Click."""
         return Services.get_tags()
@@ -43,6 +54,8 @@ class Services(click.MultiCommand):
             raise click.ClickException(f"{e}. Available Services: {', '.join(available_services)}")
 
         if hasattr(service, "cli"):
+            if callable(service.cli):
+                return service.cli()
             return service.cli
 
         raise click.ClickException(f"Service '{tag}' has no 'cli' method configured.")
@@ -51,13 +64,25 @@ class Services(click.MultiCommand):
 
     @staticmethod
     def get_tags() -> list[str]:
-        """Returns a list of service tags from all available Services."""
-        return [x.parent.stem for x in _SERVICES]
+        """Returns a list of service tags from all available Services (local + remote)."""
+        local_tags = [x.parent.stem for x in _SERVICES]
+        remote_services = Services._get_remote_services()
+        remote_tags = list(remote_services.keys())
+        return local_tags + remote_tags
 
     @staticmethod
     def get_path(name: str) -> Path:
         """Get the directory path of a command."""
         tag = Services.get_tag(name)
+
+        # Check if it's a remote service
+        remote_services = Services._get_remote_services()
+        if tag in remote_services:
+            # Remote services don't have local paths
+            # Return a dummy path or raise an appropriate error
+            # For now, we'll raise KeyError to indicate no path exists
+            raise KeyError(f"Remote service '{tag}' has no local path")
+
         for service in _SERVICES:
             if service.parent.stem == tag:
                 return service.parent
@@ -72,19 +97,38 @@ class Services(click.MultiCommand):
         """
         original_value = value
         value = value.lower()
+
+        # Check local services
         for path in _SERVICES:
             tag = path.parent.stem
             if value in (tag.lower(), *_ALIASES.get(tag, [])):
                 return tag
+
+        # Check remote services
+        remote_services = Services._get_remote_services()
+        for tag, service_class in remote_services.items():
+            if value == tag.lower():
+                return tag
+            if hasattr(service_class, "ALIASES"):
+                if value in (alias.lower() for alias in service_class.ALIASES):
+                    return tag
+
         return original_value
 
     @staticmethod
     def load(tag: str) -> Service:
-        """Load a Service module by Service tag."""
+        """Load a Service module by Service tag (local or remote)."""
+        # Check local services first
         module = _MODULES.get(tag)
-        if not module:
-            raise KeyError(f"There is no Service added by the Tag '{tag}'")
-        return module
+        if module:
+            return module
+
+        # Check remote services
+        remote_services = Services._get_remote_services()
+        if tag in remote_services:
+            return remote_services[tag]
+
+        raise KeyError(f"There is no Service added by the Tag '{tag}'")
 
 
 __all__ = ("Services",)
2 uv.lock generated
@@ -1589,7 +1589,6 @@ dependencies = [
     { name = "protobuf" },
     { name = "pycaption" },
     { name = "pycryptodomex" },
-    { name = "pyexecjs" },
     { name = "pyjwt" },
     { name = "pymediainfo" },
     { name = "pymp4" },
@@ -1641,7 +1640,6 @@ requires-dist = [
     { name = "protobuf", specifier = ">=4.25.3,<5" },
     { name = "pycaption", specifier = ">=2.2.6,<3" },
     { name = "pycryptodomex", specifier = ">=3.20.0,<4" },
-    { name = "pyexecjs", specifier = ">=1.5.1" },
     { name = "pyjwt", specifier = ">=2.8.0,<3" },
     { name = "pymediainfo", specifier = ">=6.1.0,<7" },
     { name = "pymp4", specifier = ">=1.4.0,<2" },