Showing 36 changed files with 817 additions and 910 deletions.
````diff
@@ -0,0 +1,8 @@
+# Ignore artifacts:
+.github
+dist
+docs
+examples
+scripts
+types
+*.md
````
````diff
@@ -0,0 +1 @@
+{}
````
````diff
@@ -11,14 +11,14 @@
 </p>

 <p align="center">
-    <a href="https://www.npmjs.com/package/@xenova/transformers">
-        <img alt="NPM" src="https://img.shields.io/npm/v/@xenova/transformers">
+    <a href="https://www.npmjs.com/package/@huggingface/transformers">
+        <img alt="NPM" src="https://img.shields.io/npm/v/@huggingface/transformers">
     </a>
-    <a href="https://www.npmjs.com/package/@xenova/transformers">
-        <img alt="NPM Downloads" src="https://img.shields.io/npm/dw/@xenova/transformers">
+    <a href="https://www.npmjs.com/package/@huggingface/transformers">
+        <img alt="NPM Downloads" src="https://img.shields.io/npm/dw/@huggingface/transformers">
     </a>
-    <a href="https://www.jsdelivr.com/package/npm/@xenova/transformers">
-        <img alt="jsDelivr Hits" src="https://img.shields.io/jsdelivr/npm/hw/@xenova/transformers">
+    <a href="https://www.jsdelivr.com/package/npm/@huggingface/transformers">
+        <img alt="jsDelivr Hits" src="https://img.shields.io/jsdelivr/npm/hw/@huggingface/transformers">
     </a>
     <a href="https://github.com/xenova/transformers.js/blob/main/LICENSE">
         <img alt="License" src="https://img.shields.io/github/license/xenova/transformers.js?color=blue">
````
````diff
@@ -69,7 +69,7 @@ out = pipe('I love transformers!')
 <td>

 ```javascript
-import { pipeline } from '@xenova/transformers';
+import { pipeline } from '@huggingface/transformers';

 // Allocate a pipeline for sentiment-analysis
 let pipe = await pipeline('sentiment-analysis');
````
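Only the import specifier changes; call sites stay the same. As a migration aid, here is a hedged sketch of a helper (hypothetical, not part of the library) that resolves `pipeline` from whichever package name is installed:

```javascript
// Hypothetical migration helper: try the new scoped package first,
// then fall back to the legacy name. Both specifiers are real npm
// package names; the helper itself is illustrative only.
async function loadPipeline() {
    for (const name of ['@huggingface/transformers', '@xenova/transformers']) {
        try {
            return (await import(name)).pipeline;
        } catch {
            // Package not installed under this name; try the next one.
        }
    }
    throw new Error('Transformers.js is not installed');
}
```

With neither package installed, `loadPipeline()` rejects; once either is present, the returned `pipeline` is used exactly as in the snippet above.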
````diff
@@ -93,15 +93,15 @@ let pipe = await pipeline('sentiment-analysis', 'Xenova/bert-base-multilingual-u
 ## Installation


-To install via [NPM](https://www.npmjs.com/package/@xenova/transformers), run:
+To install via [NPM](https://www.npmjs.com/package/@huggingface/transformers), run:
 ```bash
-npm i @xenova/transformers
+npm i @huggingface/transformers
 ```

 Alternatively, you can use it in vanilla JS, without any bundler, by using a CDN or static hosting. For example, using [ES Modules](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules), you can import the library with:
 ```html
 <script type="module">
-    import { pipeline } from 'https://cdn.jsdelivr.net/npm/@xenova/[email protected].0';
+    import { pipeline } from 'https://cdn.jsdelivr.net/npm/@huggingface/[email protected].3';
 </script>
 ```
````
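Because the change is purely a rename of the npm scope, existing projects can migrate mechanically. A minimal sketch, assuming GNU `sed` and sources under a `src/` directory (the scratch file below exists only to demonstrate the rewrite):

```shell
# Scratch project file using the old package name (demo only).
mkdir -p src
printf "import { pipeline } from '@xenova/transformers';\n" > src/app.js

# Rewrite the old specifier to the new scoped name in place.
grep -rl '@xenova/transformers' src | xargs sed -i "s|@xenova/transformers|@huggingface/transformers|g"

cat src/app.js
```

After the rewrite, `src/app.js` imports from `@huggingface/transformers`; remember to also update the dependency in `package.json` and reinstall.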
````diff
@@ -134,12 +134,12 @@ Check out the Transformers.js [template](https://huggingface.co/new-space?templa



-By default, Transformers.js uses [hosted pretrained models](https://huggingface.co/models?library=transformers.js) and [precompiled WASM binaries](https://cdn.jsdelivr.net/npm/@xenova/[email protected].0/dist/), which should work out-of-the-box. You can customize this as follows:
+By default, Transformers.js uses [hosted pretrained models](https://huggingface.co/models?library=transformers.js) and [precompiled WASM binaries](https://cdn.jsdelivr.net/npm/@huggingface/[email protected].3/dist/), which should work out-of-the-box. You can customize this as follows:

 ### Settings

 ```javascript
-import { env } from '@xenova/transformers';
+import { env } from '@huggingface/transformers';

 // Specify a custom location for models (defaults to '/models/').
 env.localModelPath = '/path/to/models/';
````
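Beyond `localModelPath`, the `env` object exposes further knobs for self-hosting. A hedged configuration sketch (a config fragment; the field names follow the Transformers.js documentation, but verify them against the version you install):

```javascript
import { env } from '@huggingface/transformers';

// Serve models from your own server instead of the Hugging Face Hub.
env.allowRemoteModels = false;   // disable fetching from the Hub
env.allowLocalModels = true;     // enable loading from localModelPath
env.localModelPath = '/path/to/models/';

// Host the ONNX Runtime WASM binaries yourself rather than via CDN.
env.backends.onnx.wasm.wasmPaths = '/path/to/wasm/';
```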
````diff
@@ -302,6 +302,7 @@ You can refine your search by selecting the task you're interested in (e.g., [te
 1. **[FLAN-T5](https://huggingface.co/docs/transformers/model_doc/flan-t5)** (from Google AI) released in the repository [google-research/t5x](https://github.com/google-research/t5x/blob/main/docs/models.md#flan-t5-checkpoints) by Hyung Won Chung, Le Hou, Shayne Longpre, Barret Zoph, Yi Tay, William Fedus, Eric Li, Xuezhi Wang, Mostafa Dehghani, Siddhartha Brahma, Albert Webson, Shixiang Shane Gu, Zhuyun Dai, Mirac Suzgun, Xinyun Chen, Aakanksha Chowdhery, Sharan Narang, Gaurav Mishra, Adams Yu, Vincent Zhao, Yanping Huang, Andrew Dai, Hongkun Yu, Slav Petrov, Ed H. Chi, Jeff Dean, Jacob Devlin, Adam Roberts, Denny Zhou, Quoc V. Le, and Jason Wei
+1. **Florence2** (from Microsoft) released with the paper [Florence-2: Advancing a Unified Representation for a Variety of Vision Tasks](https://arxiv.org/abs/2311.06242) by Bin Xiao, Haiping Wu, Weijian Xu, Xiyang Dai, Houdong Hu, Yumao Lu, Michael Zeng, Ce Liu, Lu Yuan.
 1. **[Gemma](https://huggingface.co/docs/transformers/main/model_doc/gemma)** (from Google) released with the paper [Gemma: Open Models Based on Gemini Technology and Research](https://blog.google/technology/developers/gemma-open-models/) by the Gemma Google team.
 1. **[Gemma2](https://huggingface.co/docs/transformers/main/model_doc/gemma2)** (from Google) released with the paper [Gemma2: Open Models Based on Gemini Technology and Research](https://blog.google/technology/developers/google-gemma-2/) by the Gemma Google team.
 1. **[GLPN](https://huggingface.co/docs/transformers/model_doc/glpn)** (from KAIST) released with the paper [Global-Local Path Networks for Monocular Depth Estimation with Vertical CutDepth](https://arxiv.org/abs/2201.07436) by Doyeon Kim, Woonghyun Ga, Pyungwhan Ahn, Donggyu Joo, Sehwan Chun, Junmo Kim.
 1. **[GPT Neo](https://huggingface.co/docs/transformers/model_doc/gpt_neo)** (from EleutherAI) released in the repository [EleutherAI/gpt-neo](https://github.com/EleutherAI/gpt-neo) by Sid Black, Stella Biderman, Leo Gao, Phil Wang and Connor Leahy.
 1. **[GPT NeoX](https://huggingface.co/docs/transformers/model_doc/gpt_neox)** (from EleutherAI) released with the paper [GPT-NeoX-20B: An Open-Source Autoregressive Language Model](https://arxiv.org/abs/2204.06745) by Sid Black, Stella Biderman, Eric Hallahan, Quentin Anthony, Leo Gao, Laurence Golding, Horace He, Connor Leahy, Kyle McDonell, Jason Phang, Michael Pieler, USVSN Sai Prashanth, Shivanshu Purohit, Laria Reynolds, Jonathan Tow, Ben Wang, Samuel Weinbach
````
````diff
@@ -1,12 +1,12 @@

-To install via [NPM](https://www.npmjs.com/package/@xenova/transformers), run:
+To install via [NPM](https://www.npmjs.com/package/@huggingface/transformers), run:
 ```bash
-npm i @xenova/transformers
+npm i @huggingface/transformers
 ```

 Alternatively, you can use it in vanilla JS, without any bundler, by using a CDN or static hosting. For example, using [ES Modules](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules), you can import the library with:
 ```html
 <script type="module">
-    import { pipeline } from 'https://cdn.jsdelivr.net/npm/@xenova/[email protected].0';
+    import { pipeline } from 'https://cdn.jsdelivr.net/npm/@huggingface/[email protected].3';
 </script>
 ```
````
````diff
@@ -1,11 +1,11 @@


-By default, Transformers.js uses [hosted pretrained models](https://huggingface.co/models?library=transformers.js) and [precompiled WASM binaries](https://cdn.jsdelivr.net/npm/@xenova/[email protected].0/dist/), which should work out-of-the-box. You can customize this as follows:
+By default, Transformers.js uses [hosted pretrained models](https://huggingface.co/models?library=transformers.js) and [precompiled WASM binaries](https://cdn.jsdelivr.net/npm/@huggingface/[email protected].3/dist/), which should work out-of-the-box. You can customize this as follows:

 ### Settings

 ```javascript
-import { env } from '@xenova/transformers';
+import { env } from '@huggingface/transformers';
 // Specify a custom location for models (defaults to '/models/').
 env.localModelPath = '/path/to/models/';
````