e52c487d76
- update to 3.1.2:
  * fix(langchain): do not stringify metadata unnecessarily
  * chore: add good .env defaults
  * fix(openai): add check for metadata NotGiven
  * fix(scores): skip sampling if OTEL sampler not available
  * fix(dataset-run-items): correctly type datasetRunItems.list
- update to 3.1.1:
  * fix(client): do not escape url param with httpx > 0.28
- update to 3.1.0:
  * feat(prompts): chat message placeholders
- update to 3.0.8:
  * feat(langchain): add last trace id property to CallbackHandler
  * fix(openai): chat.completions.parse out of beta
- update to 2.60.9:
  * fix(openai): chat completions parse out of beta (sdk-v2)
- update to 3.0.7:
  * fix(client): auth check on missing credentials
  * fix(flushing): rely on OTEL default flush settings
- update to 3.0.6:
  * fix(prompts): escape json in get_langchain_prompt
- update to 3.0.5:
  * feat(client): allow filtering spans by instrumentation scope
  * fix(observe): default IO capture on decorated functions
- update to 3.0.4:
  * fix(client): use tracing_enabled in get_client
- update to 3.0.3:
  * fix(client): correctly scope client in multiproject setups
  * chore(deps): bump protobuf from 5.29.4 to 5.29.5
- update to 3.0.2:
Dirk Mueller 2025-07-05 10:29:58 +00:00
4d463bba64
Accepting request 1268730 from devel:languages:python
Ana Guerrero 2025-04-14 10:56:00 +00:00
fe33f6cd74
- update to 2.60.2:
  * fix(cost-tracking): handle none values in OpenAI schema
- update to 2.60.1:
  * fix(openai): remove unused openai.resources import
- update to 2.60.0:
  * feat(openai): add Response API support
  * fix(ingestion_consumer): mask before multimodal handling
- update to 2.59.7:
  * feat(client): add native environments
- update to 2.59.6:
  * fix(openai): handle missing text property on streamed completion
- update to 2.59.5:
  * Resolve runtime error with openai extension when metadata is missing
  * fix(openai): apply langfuse_mask
- update to 2.59.4:
  * fix(langchain): cached token usage
- update to 2.59.3:
  * fix(openai): implement aclose on async stream responses
- update to 2.59.2:
  * fix(serializer): NaN handling
  * feat(prompts): add commit message to prompt creation
- update to 2.59.1:
  * perf(ingestion): make max event and batch size configurable
- update to 2.59.0:
  * feat(api): expose public api client
  * feat(client): add async api client
- update to 2.58.2:
  * fix(openai): handle usage object without mutation
Dirk Mueller 2025-04-11 21:00:58 +00:00
c69014b800
Accepting request 1225319 from devel:languages:python
Ana Guerrero 2024-11-20 16:44:43 +00:00
568223b8bc
- update to 2.54.1:
  * fix(media): allow setting IO media via decorator update
- update to 2.54.0:
  * feat(core): add multimodal support
  * fix(openai): pass parsed_n only if greater than 1
- update to 2.53.9:
  * perf: move serialization to background threads
- update to 2.53.8:
  * fix(datasets): encoding
- update to 2.53.7:
  * fix(openai): revert default stream option setting
- update to 2.53.6:
  * fix(serializer): reduce log level to debug on failed serialization
- update to 2.53.5:
  * fix(serializer): pydantic compat v1 v2
- update to 2.53.4:
  * feat(openai): parse usage if stream_options has include_usage
- update to 2.53.3:
  * fix(datasets): url encode dataset name and run name
  * refactor(llama-index): send generation updates directly from event handler
- update to 2.53.2:
  * fix(llama-index): CompletionResponse Serialization by @hassiebp
- update to 2.53.1:
  * fix: 'NoneType' object has no attribute '__dict__'
- update to 2.53.0:
  * feat(client): allow masking event input and output by @shawnzhu and @hassiebp
Dirk Mueller 2024-11-20 15:21:45 +00:00