Image Classification
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('resnest101e.in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
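The softmax-then-top-k step above can be illustrated without a model. The following is a minimal pure-Python sketch of what `output.softmax(dim=1)` and `torch.topk` compute for a single row of logits; the logits values are made up for illustration.

```python
import math

def softmax(logits):
    # subtract the max for numerical stability, then normalize the exponentials
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def topk(values, k):
    # return the k largest values and their original indices, like torch.topk
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)[:k]
    return [values[i] for i in order], order

logits = [2.0, 1.0, 0.1, 3.5, -1.0]        # stand-in for one row of model output
probs = [p * 100 for p in softmax(logits)]  # percentages, as in the snippet above
top_probs, top_idx = topk(probs, k=3)
print(top_idx)  # indices of the 3 most likely classes
```

The probabilities always sum to 100% before truncation to the top k, so the top-5 values in the snippet above are not a full distribution on their own.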
Feature Map Extraction
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'resnest101e.in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0))  # unsqueeze single image into batch of 1

for o in output:
    # print shape of each feature map in output
    # e.g.:
    #  torch.Size([1, 128, 128, 128])
    #  torch.Size([1, 256, 64, 64])
    #  torch.Size([1, 512, 32, 32])
    #  torch.Size([1, 1024, 16, 16])
    #  torch.Size([1, 2048, 8, 8])
    print(o.shape)
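Each feature level is a fixed downsampling of the input resolution. Assuming the per-level reduction factors that timm exposes via `model.feature_info` for this model (2, 4, 8, 16, 32), the spatial sizes above can be predicted from the 256×256 input with simple arithmetic:

```python
def feature_map_sizes(input_size, reductions):
    # each feature level i is downsampled by reductions[i] relative to the input
    return [input_size // r for r in reductions]

# assumed reduction factors for resnest101e's five feature levels
reductions = [2, 4, 8, 16, 32]
sizes = feature_map_sizes(256, reductions)
print(sizes)  # [128, 64, 32, 16, 8] — the spatial dims of the shapes printed above
```

The same arithmetic explains why feeding a different input size (e.g. 224×224) yields proportionally smaller feature maps.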
Image Embeddings
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'resnest101e.in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0))  # output is (batch_size, num_features) shaped tensor

# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 2048, 8, 8) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
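A common use of these pooled embeddings is comparing images, e.g. by cosine similarity. Here is a minimal pure-Python sketch; the short dummy vectors stand in for real `(1, num_features)` embedding rows.

```python
import math

def cosine_similarity(a, b):
    # cos(theta) = a·b / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

emb_a = [0.2, 0.9, 0.1, 0.4]  # stand-in for one image's embedding
emb_b = [0.1, 0.8, 0.0, 0.5]  # stand-in for another image's embedding
print(cosine_similarity(emb_a, emb_b))
```

With real embeddings you would flatten each `(1, num_features)` tensor to a list (or use `torch.nn.functional.cosine_similarity` directly on the tensors).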
Model Comparison
Explore the dataset and runtime metrics of this model in the timm model results.
Citation
@article{zhang2020resnest,
  title={ResNeSt: Split-Attention Networks},
  author={Zhang, Hang and Wu, Chongruo and Zhang, Zhongyue and Zhu, Yi and Zhang, Zhi and Lin, Haibin and Sun, Yue and He, Tong and Muller, Jonas and Manmatha, R. and Li, Mu and Smola, Alexander},
  journal={arXiv preprint arXiv:2004.08955},
  year={2020}
}