ai-forever / RUDOLPH-2.7B

huggingface.co
Total runs: 0
24-hour runs: 0
7-day runs: 0
30-day runs: 0
Model Last Updated: October 6, 2022

Model Details of RUDOLPH-2.7B

RUDOLPH-2.7B (XL)

RUDOLPH: One Hyper-Tasking Transformer Can be Creative as DALL-E and GPT-3 and Smart as CLIP

The model was trained by the Sber AI team.

Model Description

Russian Decoder On Language Picture Hyper-tasking (RUDOLPH) 2.7B is the largest text-image-text transformer in the family, designed for easy fine-tuning on a range of tasks: from generating images from text descriptions and image classification to visual question answering and beyond. The model demonstrates the power of Hyper-tasking Transformers.

A hyper-tasking model is a generalized multi-tasking model, i.e., a model that can solve almost all tasks within its supported modalities, necessarily including mutual pairwise translations between modalities (two modalities in the case of RUDOLPH: images and Russian text).

  • Tasks: text2image generation, self-reranking, text ranking, image ranking, image2text generation, zero-shot image classification, text2text generation, text QA, math QA, image captioning, image generation, text recognition in the wild, visual QA, and more
  • Language: Russian
  • Type: decoder
  • Num Parameters: 2.7B
  • Training Data Volume: 119 million text-image pairs, 60 million text paragraphs
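
As a minimal getting-started sketch, the checkpoint can be fetched straight from the Hugging Face Hub with the huggingface_hub library. Note that the checkpoint filename below is an assumption and should be checked against the repository's file listing; the authors' companion rudolph package provides the actual model classes for the tasks listed above.

```python
# Minimal sketch: fetch the RUDOLPH-2.7B checkpoint from the Hub.
# The filename "pytorch_model.bin" is an assumption; verify it against
# the file listing at https://huggingface.co/ai-forever/RUDOLPH-2.7B.
from huggingface_hub import hf_hub_download

checkpoint_path = hf_hub_download(
    repo_id="ai-forever/RUDOLPH-2.7B",
    filename="pytorch_model.bin",  # assumed checkpoint filename
)
print(checkpoint_path)  # local path to the cached checkpoint file
```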

Architecture Details

The maximum sequence length depends on the modality: 384 tokens for the left text, 576 for the image, and 128 for the right text.

RUDOLPH 2.7B is a Transformer-based decoder model with the following parameters:

  • num_layers (32) — Number of hidden layers in the Transformer decoder.
  • hidden_size (2560) — Dimensionality of the hidden layers.
  • num_attention_heads (32) — Number of attention heads for each attention layer.
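
These figures can be cross-checked with a little arithmetic: the three segment lengths sum to the full context window, and a standard decoder-layer parameter count (roughly 12·hidden_size² per layer, assuming the common 4x feed-forward expansion and ignoring embeddings and layer norms) lands close to the advertised 2.7B. A small sketch:

```python
# Sanity checks using only the numbers quoted above (approximate).
left_text, image, right_text = 384, 576, 128
print("full context length:", left_text + image + right_text)  # 1088

num_layers, hidden_size = 32, 2560
# Per decoder layer: ~4*h^2 for attention (Q, K, V, output projections)
# plus ~8*h^2 for a 4x feed-forward block => ~12*h^2 per layer.
approx_params = num_layers * 12 * hidden_size**2
print(f"approx. decoder parameters: {approx_params / 1e9:.2f}B")  # ~2.52B
```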

Sparse Attention Masks

The primary proposed method is a modification of the sparse transformer's attention mask that gives finer control over the modalities. It lets the model compute transitions between modalities in both directions, unlike the similar DALL-E transformer, which supported only the "text to image" direction. The proposed "image to right text" direction is achieved by extending the sparse attention mask to the right, so that text is generated auto-regressively conditioned on both the image and the left text.
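
As an illustration only (not the authors' exact mask layout), the sketch below builds a plain causal mask over the three concatenated segments; under such a mask, right-text positions attend to all image and left-text positions, which is what enables the "image to right text" direction. RUDOLPH's real masks additionally apply sparse patterns inside the image block, which this sketch omits.

```python
import torch

# Segment lengths quoted in this card: left text, image, right text.
LEFT, IMG, RIGHT = 384, 576, 128
total = LEFT + IMG + RIGHT  # 1088 positions in the full sequence

# Simplified dense causal mask (True = attention allowed). RUDOLPH's
# actual masks are sparse inside the image block; this sketch keeps only
# the causal structure that makes both modality directions possible.
mask = torch.tril(torch.ones(total, total)).bool()

# A right-text position sees every left-text and image position, so text
# is generated conditioned on both the image and the left text:
assert mask[LEFT + IMG, : LEFT + IMG].all()
# An image position sees the whole left text ("text to image" direction):
assert mask[LEFT, :LEFT].all()
```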

Authors


More Information About the RUDOLPH-2.7B Model

RUDOLPH-2.7B on huggingface.co

RUDOLPH-2.7B is hosted on huggingface.co and can be used there immediately. huggingface.co offers a free trial of the model as well as paid usage, and the model can be called through an API from Node.js, Python, or plain HTTP.

Try ai-forever RUDOLPH-2.7B online for free

huggingface.co integrates RUDOLPH-2.7B's capabilities, including API services, and provides a free online trial of the model at the link below:

https://huggingface.co/ai-forever/RUDOLPH-2.7B

RUDOLPH-2.7B installation

RUDOLPH-2.7B is open source, and its code can be found on GitHub and installed from there for free. huggingface.co also hosts the model ready to run, so users can debug and try RUDOLPH-2.7B directly on huggingface.co without a local installation; free API access is available as well.

RUDOLPH-2.7B install URL on huggingface.co:

https://huggingface.co/ai-forever/RUDOLPH-2.7B


Provider of RUDOLPH-2.7B on huggingface.co

ai-forever (organization)
