ai-forever / RUDOLPH-1.3B

Model last updated: October 06, 2022

Model Details of RUDOLPH-1.3B

RUDOLPH-1.3B (Large)

RUDOLPH: One Hyper-Tasking Transformer Can be Creative as DALL-E and GPT-3 and Smart as CLIP

The model was trained by the Sber AI team.

Model Description

RUssian Decoder On Language Picture Hyper-tasking (RUDOLPH) 1.3B is a large text-image-text transformer designed for easy fine-tuning on a range of tasks: from generating images from text descriptions and image classification to visual question answering and more. The model demonstrates the power of Hyper-tasking Transformers.

A Hyper-tasking model is a generalized multi-tasking model, i.e., a model that can solve almost all tasks within its supported modalities, necessarily including mutual pairwise translations between modalities (two modalities in the case of RUDOLPH: images and Russian texts).

  • Tasks: text2image generation, self-reranking, text ranking, image ranking, image2text generation, zero-shot image classification, text2text generation, text QA, math QA, image captioning, image generation, text-in-the-wild, visual QA, and more
  • Language: Russian
  • Type: decoder
  • Num Parameters: 1.3B
  • Training Data Volume: 141 million text-image pairs, 60 million text paragraphs

Details of architecture

The maximum sequence length depends on the modality: 128 tokens for the left text, 1024 tokens for the image, and 128 tokens for the right text, respectively.
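As a minimal sketch of this layout (the segment names and constants below are illustrative, not part of the official API), the full context concatenates the three segments into a 1280-token sequence:

```python
# Assumed token layout of the RUDOLPH 1.3B context window:
# [ left text | image | right text ] = 128 + 1024 + 128 tokens.
LEFT_TEXT_LEN = 128    # left (conditioning) text tokens
IMAGE_LEN = 1024       # image tokens (e.g. a 32x32 grid of codes)
RIGHT_TEXT_LEN = 128   # right (generated) text tokens

total_len = LEFT_TEXT_LEN + IMAGE_LEN + RIGHT_TEXT_LEN
print(total_len)  # 1280

# Segment boundaries (start, end) within the full sequence:
segments = {
    "left_text": (0, LEFT_TEXT_LEN),
    "image": (LEFT_TEXT_LEN, LEFT_TEXT_LEN + IMAGE_LEN),
    "right_text": (LEFT_TEXT_LEN + IMAGE_LEN, total_len),
}
print(segments["image"])  # (128, 1152)
```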

RUDOLPH 1.3B is a Transformer-based decoder model with the following parameters:

  • num_layers (24) — Number of hidden layers in the Transformer decoder.
  • hidden_size (2048) — Dimensionality of the hidden layers.
  • num_attention_heads (16) — Number of attention heads for each attention layer.

Sparse Attention Mask

The primary proposed method modifies the sparse transformer's attention mask to better control modalities. This lets the model compute transitions between modalities in both directions, unlike the similar DALL-E transformer, which supports only one direction, "text to image". The proposed "image to right text" direction is achieved by extending the sparse attention mask to the right, so that text is generated auto-regressively conditioned on both the image and the left text.
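The bidirectional-transition idea can be sketched with a plain causal mask over the [left text | image | right text] sequence (tiny segment sizes here for illustration, not the real 128/1024/128; the extra per-modality sparsity inside the image block, e.g. row/column patterns, is omitted). Because causal attention lets every token see all earlier positions, image tokens condition on the left text ("text to image") and right-text tokens condition on both the left text and the image ("image to right text"):

```python
import numpy as np

# Tiny illustrative segment sizes: [left text | image | right text]
left, image, right = 2, 4, 2
n = left + image + right

# Lower-triangular causal mask: True means "may attend to this position".
mask = np.tril(np.ones((n, n), dtype=bool))

# A right-text token (last row) sees every left-text and image token,
# which is what enables auto-regressive captioning / visual QA:
assert mask[-1, :left + image].all()

# An image token never sees right-text tokens (they come later),
# so text2image generation stays conditioned only on the left text:
assert not mask[left, left + image:].any()
```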


Model URL: https://huggingface.co/ai-forever/RUDOLPH-1.3B

Provider: ai-forever