LocalAI updates #1222
Replies: 1 comment
-
Hi @mudler. Thanks for your announcement. It is good to let people know what we are focusing on. From many years of experience on open-source projects, the most important point for a community-driven project is how to keep a stable, clear roadmap and the same quality of code as more people join the community as members. I was the maintainer of several Helm charts in the Helm repo several years ago, so I can help with the Helm charts repo. Currently, I am focusing on implementing the Rust backend with the burn ML framework. After this milestone, I believe I will be more familiar with LLM programming, so I can also help with the external backends (like the Python ones).
-
Hey everyone,
I hope you're all doing well. I wanted to provide you with a project update regarding our ongoing work on LocalAI. We're gearing up for a significant milestone with the upcoming release of version v2, which includes a substantial overhaul of our internal backends. You can stay up to date with all the developments on our GitHub issue tracker: Backends v2. Before the v2 release, I'm planning an intermediate release with the additional C++ backend.
More native backends
Since the internals of LocalAI switched to a gRPC structure ( #743 ), a set of new backends made it possible to integrate features that weren't possible before. This also enabled work on a C++ backend for llama.cpp, which helps keep the codebase closer to upstream (#1170). As a result, several backends are now deprecated and the LocalAI code base is less Go-based, although Go is still used for the API and for a few legacy backends.
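To give a rough idea of the shape of this structure, here is a minimal sketch, not LocalAI's actual backend proto or API, of a backend running as its own process and exposing a gRPC server in Go. The listen address and the use of the standard gRPC health service are assumptions made purely for illustration.

```go
package main

import (
	"log"
	"net"

	"google.golang.org/grpc"
	"google.golang.org/grpc/health"
	healthpb "google.golang.org/grpc/health/grpc_health_v1"
)

func main() {
	// A backend runs as a separate process and listens on its own
	// address; 127.0.0.1:50051 here is just a placeholder.
	lis, err := net.Listen("tcp", "127.0.0.1:50051")
	if err != nil {
		log.Fatalf("failed to listen: %v", err)
	}

	s := grpc.NewServer()

	// Register the standard gRPC health service so the caller can
	// probe whether this backend is up before routing requests.
	// A real backend would also register its own model/prediction
	// service here (omitted, since the proto is backend-specific).
	healthpb.RegisterHealthServer(s, health.NewServer())

	log.Printf("backend listening on %s", lis.Addr())
	if err := s.Serve(lis); err != nil {
		log.Fatalf("serve error: %v", err)
	}
}
```

Because each backend is just a separate gRPC process, backends can be written in any language (C++, Python, Rust, and so on) while the Go API talks to them over the same interface.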
As we move forward, there are a few important changes and developments to note:
Archiving Old Backends and Repositories
To streamline our efforts and reduce maintenance overhead, we'll be archiving some old backends and repositories. This will help us focus on the most critical aspects of the project. There is no real impact besides lighter images, and there will be no loss in feature set (actually, we will gain more, see: Backends v2!). Some of the backends that were created are now superseded by llama.cpp, which provides extended model support (for instance gpt-2, gptj, falcon, and more).
Organizational Alignment
We've made some changes to the organization of our repositories. Some repositories have been moved to align more closely with the organization's scope, specifically within the "go-skynet" umbrella. From now on, go-skynet will host the Golang bindings, and we are archiving bindings that are no longer receiving updates. Some repositories have been transferred, so you probably won't even notice :)
Change in Repository Ownership
You may have noticed that the repository has been moved from an organizational account back to my personal user account. This change serves two main purposes:
Maintainers needed
I've attempted to find a home within the "go-skynet" organization for community-related projects, such as the UI and the model gallery, but unfortunately we couldn't find maintainers, and these side projects quickly became stagnant.
The LocalAI community is amazing, and I'm extremely grateful for where we are now. I'm not pointing fingers, but I think it's essential to acknowledge that community-driven projects require active engagement from the community itself, and being transparent with you is also very important to me. Maintaining all these pieces alone is simply not feasible. In this way, I'm trying to encourage community members to step up and take ownership of other portions, so I can better focus on LocalAI development.
Given these changes, several repositories are in need of new maintainers, even though I will keep maintaining them until they find a new home and/or are required by LocalAI:
If you're interested in taking up the responsibility of maintaining any of the repositories above, or have any questions, please open an issue in the respective repository and ping me, or contact me on Discord. A good first step would, of course, also be to start contributing! Your contribution will be highly appreciated as we continue to evolve and improve LocalAI together!
Thank you for your continued support, and let's keep the LocalAI community thriving!
Cheers,
Ettore