chore(deps): update dependency mlflow to v2.18.0 #31

Open · wants to merge 1 commit into base: main
Conversation


anaconda-renovate bot commented on Sep 2, 2024

This PR contains the following updates:

Package    Update    Change
mlflow     minor     ==2.6.0 -> ==2.18.0

Warning

Some dependencies could not be looked up. Check the Dependency Dashboard for more information.


Release Notes

mlflow/mlflow (mlflow)

v2.18.0

Compare Source

We are excited to announce the release of MLflow 2.18.0! This release includes a number of significant features, enhancements, and bug fixes.

Python Version Update

Python 3.8 is now at an end-of-life point. With official support being dropped for this legacy version, MLflow now requires Python 3.9
as a minimum supported version.

Note: If you are currently using MLflow's ChatModel interface for authoring custom GenAI applications, please ensure that you
have read the future breaking changes section below.

Major New Features
  • 🦺 Fluent API Thread/Process Safety - MLflow's fluent APIs for tracking and the model registry have been overhauled to add support for both thread and multi-process safety. You are no longer forced to use the Client APIs for managing experiments, runs, and logging from within multiprocessing and threaded applications; a minimal usage sketch follows this list. (#​13456, #​13419, @​WeichenXu123)

  • 🧩 DSPy flavor - MLflow now supports logging, loading, and tracing of DSPy models, broadening the support for advanced GenAI authoring within MLflow. Check out the MLflow DSPy Flavor documentation to get started! (#​13131, #​13279, #​13369, #​13345, @​chenmoneygithub, #​13543, #​13800, #​13807, @​B-Step62, #​13289, @​michael-berk)

  • 🖥️ Enhanced Trace UI - MLflow Tracing's UI has undergone
    a significant overhaul to bring usability and quality-of-life updates to the experience of auditing and investigating the contents of GenAI traces, from enhanced span content rendering using markdown to a standardized span component structure. (#​13685, #​13357, #​13242, @​daniellok-db)

  • 🚄 New Tracing Integrations - MLflow Tracing now supports DSPy, LiteLLM, and Google Gemini, enabling a one-line, fully automated tracing experience. These integrations unlock enhanced observability across a broader range of industry tools. Stay tuned for upcoming integrations and updates! (#​13801, @​TomeHirata, #​13585, @​B-Step62)

  • 📊 Expanded LLM-as-a-Judge Support - MLflow now enhances its evaluation capabilities with support for additional providers, including Anthropic, Bedrock, Mistral, and TogetherAI, alongside existing providers like OpenAI. Users can now also configure proxy endpoints or self-hosted LLMs that follow the provider API specs by using the new proxy_url and extra_headers options. Visit the LLM-as-a-Judge documentation for more details! (#​13715, #​13717, @​B-Step62)

  • ⏰ Environment Variable Detection - As a helpful reminder for when you are deploying models, MLflow now detects and reminds users of environment variables set during model logging, ensuring they are configured for deployment. In addition to this, the mlflow.models.predict utility has also been updated to include these variables in serving simulations, improving pre-deployment validation. (#​13584, @​serena-ruan)
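
As a quick illustration of the fluent-API safety item above, a pattern like the following no longer requires dropping down to the Client APIs. This is a minimal sketch only: the experiment name, parameter, metric, and worker count are placeholders, not taken from the release notes.

    import mlflow
    from concurrent.futures import ThreadPoolExecutor

    mlflow.set_experiment("threaded-demo")  # placeholder experiment name

    def train_worker(worker_id: int) -> None:
        # Each worker opens its own run; the fluent APIs now keep
        # per-thread run state instead of a single shared active run.
        with mlflow.start_run(run_name=f"worker-{worker_id}"):
            mlflow.log_param("worker_id", worker_id)
            mlflow.log_metric("loss", 1.0 / (worker_id + 1))

    with ThreadPoolExecutor(max_workers=4) as pool:
        list(pool.map(train_worker, range(4)))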

Breaking Changes to ChatModel Interface
  • ChatModel Interface Updates - As part of a broader unification effort within MLflow and the services that rely on or deeply integrate
    with MLflow's GenAI features, we are taking a phased approach to establishing a consistent, standard interface for custom GenAI
    application development and usage. In the first phase (rolling out over the next few MLflow releases), we are marking
    several interfaces as deprecated, as they will be changing. These changes will be:

    • Renaming of Interfaces:
      • ChatRequest → ChatCompletionRequest to provide disambiguation for future planned request interfaces.
      • ChatResponse → ChatCompletionResponse for the same reason as the input interface.
      • metadata fields within ChatRequest and ChatResponse → custom_inputs and custom_outputs, respectively.
    • Streaming Updates:
      • predict_stream will be updated to enable true streaming for custom GenAI applications. Currently, it returns a generator with synchronous outputs from predict. In a future release, it will return a generator of ChatCompletionChunks, enabling asynchronous streaming. While the API call structure will remain the same, the returned data payload will change significantly, aligning with LangChain’s implementation.
    • Legacy Dataclass Deprecation:
      • Dataclasses in mlflow.models.rag_signatures will be deprecated, merging into unified ChatCompletionRequest, ChatCompletionResponse, and ChatCompletionChunks.

Other Features:

Bug fixes:

  • [Database] Cascade deletes to datasets when deleting experiments to fix a bug in MLflow's gc command when deleting experiments with logged datasets (#​13741, @​daniellok-db)
  • [Models] Fix a bug with Langchain's pyfunc predict input conversion (#​13652, @​serena-ruan)
  • [Models] Fix signature inference for subclasses and Optional dataclasses that define a model's signature (#​13440, @​bbqiu)
  • [Tracking] Fix an issue with async logging batch splitting validation rules (#​13722, @​WeichenXu123)
  • [Tracking] Fix an issue with LangChain's autologging thread-safety behavior (#​13672, @​B-Step62)
  • [Tracking] Disable support for running spark autologging in a threadpool due to limitations in Spark (#​13599, @​WeichenXu123)
  • [Tracking] Mark role and index as required for chat schema (#​13279, @​chenmoneygithub)
  • [Tracing] Handle raw response in openai autolog (#​13802, @​harupy)
  • [Tracing] Fix a bug with tracing source run behavior when running inference with multithreading on Langchain models (#​13610, @​WeichenXu123)

Documentation updates:

Small bug fixes and documentation updates:

#​13775, #​13768, #​13764, #​13744, #​13699, #​13742, #​13703, #​13669, #​13682, #​13569, #​13563, #​13562, #​13539, #​13537, #​13533, #​13408, #​13295, @​serena-ruan; #​13768, #​13764, #​13761, #​13738, #​13737, #​13735, #​13734, #​13723, #​13726, #​13662, #​13692, #​13689, #​13688, #​13680, #​13674, #​13666, #​13661, #​13625, #​13460, #​13626, #​13546, #​13621, #​13623, #​13603, #​13617, #​13614, #​13606, #​13600, #​13583, #​13601, #​13602, #​13604, #​13598, #​13596, #​13597, #​13531, #​13594, #​13589, #​13581, #​13112, #​13587, #​13582, #​13579, #​13578, #​13545, #​13572, #​13571, #​13564, #​13559, #​13565, #​13558, #​13541, #​13560, #​13556, #​13534, #​13386, #​13532, #​13385, #​13384, #​13383, #​13507, #​13523, #​13518, #​13492, #​13493, #​13487, #​13490, #​13488, #​13449, #​13471, #​13417, #​13445, #​13430, #​13448, #​13443, #​13429, #​13418, #​13412, #​13382, #​13402, #​13381, #​13364, #​13356, #​13309, #​13313, #​13334, #​13331, #​13273, #​13322, #​13319, #​13308, #​13302, #​13268, #​13298, #​13296, @​harupy; #​13705, @​williamjamir; #​13632, @​shichengzhou-db; #​13755, #​13712, #​13260, @​BenWilson2; #​13745, #​13743, #​13697, #​13548, #​13549, #​13577, #​13349, #​13351, #​13350, #​13342, #​13341, @​WeichenXu123; #​13807, #​13798, #​13787, #​13786, #​13762, #​13749, #​13733, #​13678, #​13721, #​13611, #​13528, #​13444, #​13450, #​13360, #​13416, #​13415, #​13336, #​13305, #​13271, @​B-Step62; #​13808, #​13708, @​smurching; #​13739, @​fedorkobak; #​13728, #​13719, #​13695, #​13677, @​TomeHirata; #​13776, #​13736, #​13649, #​13285, #​13292, #​13282, #​13283, #​13267, @​daniellok-db; #​13711, @​bhavya2109sharma; #​13693, #​13658, @​aravind-segu; #​13553, @​dsuhinin; #​13663, @​gitlijian; #​13657, #​13629, @​parag-shendye; #​13630, @​JohannesJungbluth; #​13613, @​itepifanio; #​13480, @​agjendem; #​13627, @​ilyaresh; #​13592, #​13410, #​13358, #​13233, @​nojaf; #​13660, #​13505, @​sunishsheth2009; #​13414, @​lmoros-DB; #​13399, @​Abubakar17; #​13390, @​KekmaTime; #​13291, @​michael-berk; #​12511, @​jgiannuzzi; #​13265, @​Ahar28; #​13785, @​Rick-McCoy; #​13676, @​hyolim-e; #​13718, @​annzhang-db; #​13705, @​williamjamir

v2.17.2

Compare Source

MLflow 2.17.2 includes several major features and improvements.

Features:

Bug fixes:

Documentation updates:

Small bug fixes and documentation updates:

#​13569, @​serena-ruan; #​13595, @​BenWilson2; #​13593, @​mnijhuis-dnb;

v2.17.1

Compare Source

MLflow 2.17.1 includes several major features and improvements.

Features:

Bug fixes:

Documentation updates:

Small bug fixes and documentation updates:

#​13293, #​13510, #​13501, #​13506, #​13446, @​harupy; #​13341, #​13342, @​WeichenXu123; #​13396, @​dvorst; #​13535, @​chenmoneygithub; #​13503, #​13469, #​13416, @​B-Step62; #​13519, #​13516, @​serena-ruan; #​13504, @​sunishsheth2009; #​13508, @​KamilStachera; #​13397, @​kriscon-db

v2.17.0

Compare Source

We are excited to announce the release of MLflow 2.17.0! This release includes several enhancements that extend the
functionality of MLflow's ChatModel interface, further increasing its versatility for handling custom GenAI application use cases.
Additionally, we've improved the interface within the tracing UI to provide a structured output for retrieved documents,
making it easier to read the contents of those documents within the UI.
We're also starting work on improving both the utility and the versatility of MLflow's evaluate functionality for GenAI,
initially with support for callable GenAI evaluation metrics.

Major Features and notifications:
  • ChatModel enhancements - As the GenAI-focused 'cousin' of PythonModel, ChatModel is getting some sizable functionality
    extensions: native support for tool calling (a requirement for creating a custom agent), simpler conversion to the
    internal dataclass constructs needed to interface with ChatModel via the introduction of from_dict methods on all data structures,
    the addition of a metadata field to allow for full input payload customization, handling of the new refusal response type, and the
    inclusion of the interface type in the response structure to allow for greater integration compatibility; a minimal
    custom ChatModel sketch follows this list.
    (#​13191, #​13180, #​13143, @​daniellok-db, #​13102, #​13071, @​BenWilson2)

  • Callable GenAI Evaluation Metrics - As the initial step in a much broader expansion of the functionality of mlflow.evaluate for
    GenAI use cases, we've converted the GenAI evaluation metrics to be callable. This allows you to use them directly in packages that support
    callable GenAI evaluation metrics, and makes it simpler to debug individual responses when prototyping solutions. (#​13144, @​serena-ruan)

  • Audio file support in the MLflow UI - You can now directly 'view' audio files that have been logged and listen to them from within the MLflow UI's
    artifact viewer pane.

  • MLflow AI Gateway is no longer deprecated - We've decided to revert our deprecation for the AI Gateway feature. We had renamed it to the
    MLflow Deployments Server, but have reconsidered and reverted the naming and namespace back to the original configuration.
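
To make the ChatModel item above more concrete, a custom chat model can be authored by subclassing ChatModel and logging it with the pyfunc flavor. The sketch below is illustrative only: the echo logic and class name are invented, and it assumes the ChatMessage, ChatChoice, and ChatResponse dataclasses exposed under mlflow.types.llm.

    import mlflow
    from mlflow.pyfunc import ChatModel
    from mlflow.types.llm import ChatChoice, ChatMessage, ChatResponse

    class EchoChatModel(ChatModel):
        # Toy model for illustration: it simply echoes the last user message.
        def predict(self, context, messages, params):
            last = messages[-1].content if messages else ""
            reply = ChatMessage(role="assistant", content=f"You said: {last}")
            return ChatResponse(choices=[ChatChoice(index=0, message=reply)])

    # Logging uses the standard pyfunc flavor; ChatModel subclasses get the
    # predefined chat signature automatically.
    with mlflow.start_run():
        mlflow.pyfunc.log_model(artifact_path="chat_model", python_model=EchoChatModel())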

Features:

Bug fixes:

  • [Tracking] Fix tracing for LangGraph (#​13215, @​B-Step62)
  • [Tracking] Fix an issue with presigned_url_artifact requests being in the wrong format (#​13366, @​WeichenXu123)
  • [Models] Update Databricks dependency extraction functionality to work with the langchain-databricks partner package. (#​13266, @​B-Step62)
  • [Model Registry] Fix retry and credential refresh issues with artifact downloads from the model registry (#​12935, @​rohitarun-db)
  • [Tracking] Fix LangChain autologging so that langchain-community is not required for partner packages (#​13172, @​B-Step62)
  • [Artifacts] Fix issues with file removal for the local artifact repository (#​13005, @​rzalawad)

Documentation updates:

Small bug fixes and documentation updates:

#​13372, #​13271, #​13243, #​13226, #​13190, #​13230, #​13208, #​13130, #​13045, #​13094, @​B-Step62; #​13302, #​13238, #​13234, #​13205, #​13200, #​13196, #​13198, #​13193, #​13192, #​13194, #​13189, #​13184, #​13182, #​13161, #​13179, #​13178, #​13110, #​13162, #​13173, #​13171, #​13169, #​13168, #​13167, #​13156, #​13127, #​13133, #​13089, #​13073, #​13057, #​13058, #​13067, #​13062, #​13061, #​13052, @​harupy; #​13295, #​13219, #​13038, @​serena-ruan; #​13176, #​13164, @​WeichenXu123; #​13163, @​gabrielfu; #​13186, @​varshinimuthukumar1; #​13128, #​13115, @​nojaf; #​13120, @​levscaut; #​13152, #​13075, @​BenWilson2; #​13138, @​tanguylefloch-veesion; #​13087, @​SeanAverS; #​13285, #​13051, #​13043, @​daniellok-db; #​13224, @​levscaut;

v2.16.2

Compare Source

MLflow 2.16.2 includes several major features and improvements.

Bug fixes:

v2.16.1

Compare Source

MLflow 2.16.1 is a patch release that includes some minor feature improvements and addresses several bug fixes.

Features:

  • [Tracing] Add Support for an Open Telemetry compatible exporter to configure external sinks for MLflow traces (#​13118, @​B-Step62)
  • [Model Registry, AWS] Add support for utilizing AWS KMS-based encryption for the MLflow Model Registry (#​12495, @​artjen)
  • [Model Registry] Add support for using the OSS Unity Catalog server as a Model Registry (#​13034, #​13065, #​13066, @​rohitarun-db)
  • [Models] Introduce path-based transformers logging to reduce memory requirements for saving large transformers models (#​13070, @​B-Step62)
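
A rough illustration of the path-based transformers logging above: instead of passing an in-memory pipeline, the model is first saved to disk and the local path is handed to log_model. The directory path and task below are placeholders for this sketch.

    import mlflow

    # Assumes a transformers model/pipeline was already saved locally, e.g. with
    # save_pretrained("/tmp/my-large-transformer"); the path is a placeholder.
    local_model_dir = "/tmp/my-large-transformer"

    with mlflow.start_run():
        mlflow.transformers.log_model(
            transformers_model=local_model_dir,  # path instead of a pipeline object
            artifact_path="model",
            task="text-generation",  # assumed task for this illustration
        )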

Bug fixes:

  • [Tracking] Fix a data payload size issue with Model.get_tags_dict by eliminating the return of the internally-used config field (#​13086, @​harshilprajapati96)
  • [Models] Fix an issue with LangChain Agents where sub-dependencies were not being properly extracted (#​13105, @​aravind-segu)
  • [Tracking] Fix an issue where the wrong checkpoint for the current best model in auto checkpointing was being selected (#​12981, @​hareeen)
  • [Tracking] Fix an issue where local timezones for trace initialization were not being taken into account in AutoGen tracing (#​13047, @​B-Step62)

Documentation updates:

Small bug fixes and documentation updates:

#​13140, #​13141, #​13098, #​13091, #​13101, #​13100, #​13095, #​13044, #​13048, @​B-Step62; #​13142, #​13092, #​13132, #​13055, #​13049, @​harupy; #​13135, #​13036, #​13029, @​serena-ruan; #​13134, #​13081, #​13078, @​daniellok-db; #​13107, #​13103, @​kriscon-db; #​13104, @​arpitjasa-db; #​13022, @​nojaf; #​13069, @​minihat; #​12879, @​faizankshaikh

v2.16.0

Compare Source

We are excited to announce the release of MLflow 2.16.0. This release includes many major features and improvements!

Major features:
  • LlamaIndex Enhancements🦙 - To provide additional flexibility to the LlamaIndex integration, we now have support for the models-from-code functionality for logging, extended engine-based logging, and broadened support for external vector stores.

  • LangGraph Support - We've expanded the LangChain integration to support the agent framework LangGraph. With tracing and support for logging using the models-from-code feature, creating and storing agent applications has never been easier!

  • AutoGen Tracing - Full automatic support for tracing multi-turn agent applications built with Microsoft's AutoGen framework is now available in MLflow. Enabling autologging via mlflow.autogen.autolog() will instrument your agents built with AutoGen; a short sketch follows this list.

  • Plugin support for AI Gateway - You can now define your own provider interfaces that will work with MLflow's AI Gateway (also known as the MLflow Deployments Server). Creating an installable provider definition will allow you to connect the Gateway server to any GenAI service of your choosing.
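
To illustrate the AutoGen tracing item above, enabling autologging is a single call made before your agents run; the experiment name below is an invented placeholder for this sketch.

    import mlflow

    # One call instruments AutoGen so multi-turn agent conversations are
    # captured as MLflow traces (requires the autogen package to be installed).
    mlflow.autogen.autolog()

    # Optional: group the resulting traces under a named experiment.
    mlflow.set_experiment("autogen-tracing-demo")  # placeholder name

    # ... construct and run AutoGen agents as usual; their calls will be traced.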

Features:

Bug fixes:


Configuration

📅 Schedule: Branch creation - "every weekday" in timezone UTC, Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.


  • If you want to rebase/retry this PR, check this box

This PR has been generated by Renovate Bot.

anaconda-renovate bot changed the title from chore(deps): update dependency mlflow to v2.16.0 to chore(deps): update dependency mlflow to v2.16.1 on Sep 14, 2024
anaconda-renovate bot changed the title from chore(deps): update dependency mlflow to v2.16.1 to chore(deps): update dependency mlflow to v2.16.2 on Sep 17, 2024

Edited/Blocked Notification

Renovate will not automatically rebase this PR, because it does not recognize the last commit author and assumes somebody else may have edited the PR.

You can manually request rebase by checking the rebase/retry box above.

⚠️ Warning: custom changes will be lost.

anaconda-renovate bot changed the title from chore(deps): update dependency mlflow to v2.16.2 to chore(deps): update dependency mlflow to v2.17.0 on Oct 12, 2024
anaconda-renovate bot changed the title from chore(deps): update dependency mlflow to v2.17.0 to chore(deps): update dependency mlflow to v2.17.1 on Oct 26, 2024
anaconda-renovate bot changed the title from chore(deps): update dependency mlflow to v2.17.1 to chore(deps): update dependency mlflow to v2.17.2 on Oct 31, 2024
anaconda-renovate bot changed the title from chore(deps): update dependency mlflow to v2.17.2 to chore(deps): update dependency mlflow to v2.18.0 on Nov 19, 2024