-
Hi @Adenialzz,

First, you can get the model-specific transforms and run inference like this:

```python
from urllib.request import urlopen

from PIL import Image
import timm
import torch

img = Image.open(urlopen(
    'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))

model = timm.create_model('eva02_large_patch14_448.mim_m38m_ft_in1k', pretrained=True)
model = model.eval()

# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)

output = model(transforms(img).unsqueeze(0))  # unsqueeze single image into batch of 1

top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```

Second, you can reproduce the evaluation result of the pre-trained model by using the evaluation script:

```shell
python3 validate.py /path/to/imageNet -m eva02_large_patch14_448.mim_m38m_ft_in1k --crop-pct 1.0 --input-size 3 448 448 --interpolation bicubic -b 64 --amp --channels-last --pin-mem
```

Evaluation log:

hankyul@hankyul:~/timm$ python3 validate.py imageNet -m eva02_large_patch14_448.mim_m38m_ft_in1k --cuda 4 --crop-pct 1.0 --input-size 3 448 448 --interpolation bicubic -b 64 --amp --channels-last --pin-mem
Validating in mixed precision with native PyTorch AMP.
Loading pretrained weights from Hugging Face hub (timm/eva02_large_patch14_448.mim_m38m_ft_in1k)
[timm/eva02_large_patch14_448.mim_m38m_ft_in1k] Safe alternative available for 'pytorch_model.bin' (as 'model.safetensors'). Loading weights using safetensors.
Model eva02_large_patch14_448.mim_m38m_ft_in1k created, param count: 305080232
Data processing configuration for current model + dataset:
input_size: (3, 448, 448)
interpolation: bicubic
mean: (0.48145466, 0.4578275, 0.40821073)
std: (0.26862954, 0.26130258, 0.27577711)
crop_pct: 1.0
crop_mode: center
Test: [ 0/782] Time: 3.439s (3.439s, 18.61/s) Loss: 0.3269 (0.3269) Acc@1: 98.438 ( 98.438) Acc@5: 100.000 (100.000)
Test: [ 10/782] Time: 1.857s (1.999s, 32.02/s) Loss: 0.2588 (0.4114) Acc@1: 100.000 ( 96.023) Acc@5: 100.000 ( 99.432)
Test: [ 20/782] Time: 1.870s (1.935s, 33.08/s) Loss: 0.8745 (0.3997) Acc@1: 76.562 ( 96.057) Acc@5: 98.438 ( 99.554)
Test: [ 30/782] Time: 1.871s (1.913s, 33.45/s) Loss: 0.4209 (0.5076) Acc@1: 95.312 ( 92.087) Acc@5: 100.000 ( 99.546)
Test: [ 40/782] Time: 1.874s (1.904s, 33.62/s) Loss: 0.3896 (0.5260) Acc@1: 96.875 ( 91.616) Acc@5: 98.438 ( 99.352)
Test: [ 50/782] Time: 1.879s (1.899s, 33.71/s) Loss: 0.7354 (0.5859) Acc@1: 84.375 ( 89.890) Acc@5: 100.000 ( 99.142)
Test: [ 60/782] Time: 1.881s (1.895s, 33.76/s) Loss: 1.0996 (0.6133) Acc@1: 78.125 ( 88.883) Acc@5: 95.312 ( 98.975)
Test: [ 70/782] Time: 1.879s (1.893s, 33.80/s) Loss: 0.2761 (0.5966) Acc@1: 100.000 ( 89.723) Acc@5: 100.000 ( 98.944)
Test: [ 80/782] Time: 1.883s (1.892s, 33.83/s) Loss: 0.3113 (0.5700) Acc@1: 96.875 ( 90.567) Acc@5: 100.000 ( 99.016)
Test: [ 90/782] Time: 1.883s (1.891s, 33.85/s) Loss: 0.3848 (0.5588) Acc@1: 95.312 ( 90.986) Acc@5: 100.000 ( 99.056)
Test: [ 100/782] Time: 1.880s (1.890s, 33.87/s) Loss: 0.3621 (0.5573) Acc@1: 95.312 ( 90.981) Acc@5: 100.000 ( 99.118)
Test: [ 110/782] Time: 1.880s (1.889s, 33.88/s) Loss: 0.3125 (0.5390) Acc@1: 98.438 ( 91.470) Acc@5: 100.000 ( 99.198)
Test: [ 120/782] Time: 1.880s (1.888s, 33.90/s) Loss: 0.6685 (0.5274) Acc@1: 89.062 ( 91.865) Acc@5: 96.875 ( 99.212)
Test: [ 130/782] Time: 1.878s (1.887s, 33.91/s) Loss: 1.4229 (0.5418) Acc@1: 62.500 ( 91.436) Acc@5: 95.312 ( 99.141)
Test: [ 140/782] Time: 1.879s (1.887s, 33.92/s) Loss: 0.8267 (0.5459) Acc@1: 82.812 ( 91.434) Acc@5: 98.438 ( 99.047)
Test: [ 150/782] Time: 1.879s (1.886s, 33.93/s) Loss: 0.6055 (0.5476) Acc@1: 89.062 ( 91.442) Acc@5: 98.438 ( 98.986)
Test: [ 160/782] Time: 1.878s (1.886s, 33.94/s) Loss: 0.6680 (0.5523) Acc@1: 81.250 ( 91.266) Acc@5: 100.000 ( 98.952)
Test: [ 170/782] Time: 1.880s (1.885s, 33.94/s) Loss: 0.2534 (0.5447) Acc@1: 98.438 ( 91.475) Acc@5: 100.000 ( 98.949)
Test: [ 180/782] Time: 1.880s (1.885s, 33.95/s) Loss: 0.7998 (0.5511) Acc@1: 78.125 ( 91.342) Acc@5: 98.438 ( 98.921)
Test: [ 190/782] Time: 1.879s (1.885s, 33.96/s) Loss: 0.3928 (0.5560) Acc@1: 96.875 ( 91.059) Acc@5: 100.000 ( 98.945)
Test: [ 200/782] Time: 1.879s (1.884s, 33.96/s) Loss: 0.4788 (0.5552) Acc@1: 96.875 ( 91.053) Acc@5: 96.875 ( 98.935)
Test: [ 210/782] Time: 1.879s (1.884s, 33.97/s) Loss: 0.6338 (0.5514) Acc@1: 92.188 ( 91.121) Acc@5: 100.000 ( 98.963)
Test: [ 220/782] Time: 1.879s (1.884s, 33.97/s) Loss: 1.5166 (0.5689) Acc@1: 42.188 ( 90.597) Acc@5: 98.438 ( 98.890)
Test: [ 230/782] Time: 1.876s (1.884s, 33.97/s) Loss: 0.3643 (0.5705) Acc@1: 96.875 ( 90.652) Acc@5: 100.000 ( 98.884)
Test: [ 240/782] Time: 1.879s (1.884s, 33.98/s) Loss: 0.2842 (0.5715) Acc@1: 98.438 ( 90.612) Acc@5: 100.000 ( 98.891)
Test: [ 250/782] Time: 1.877s (1.883s, 33.98/s) Loss: 0.4243 (0.5724) Acc@1: 96.875 ( 90.563) Acc@5: 100.000 ( 98.873)
Test: [ 260/782] Time: 1.878s (1.883s, 33.99/s) Loss: 0.2446 (0.5654) Acc@1: 100.000 ( 90.793) Acc@5: 100.000 ( 98.892)
Test: [ 270/782] Time: 1.876s (1.883s, 33.99/s) Loss: 0.6689 (0.5626) Acc@1: 82.812 ( 90.838) Acc@5: 98.438 ( 98.922)
Test: [ 280/782] Time: 1.877s (1.883s, 33.99/s) Loss: 1.1943 (0.5664) Acc@1: 67.188 ( 90.664) Acc@5: 98.438 ( 98.949)
Test: [ 290/782] Time: 1.877s (1.882s, 34.00/s) Loss: 0.9761 (0.5670) Acc@1: 81.250 ( 90.732) Acc@5: 96.875 ( 98.931)
Test: [ 300/782] Time: 1.876s (1.882s, 34.00/s) Loss: 0.4758 (0.5758) Acc@1: 92.188 ( 90.568) Acc@5: 100.000 ( 98.858)
Test: [ 310/782] Time: 1.876s (1.882s, 34.00/s) Loss: 0.4087 (0.5732) Acc@1: 98.438 ( 90.655) Acc@5: 98.438 ( 98.880)
Test: [ 320/782] Time: 1.877s (1.882s, 34.01/s) Loss: 0.4944 (0.5756) Acc@1: 87.500 ( 90.606) Acc@5: 100.000 ( 98.890)
Test: [ 330/782] Time: 1.877s (1.882s, 34.01/s) Loss: 0.7588 (0.5806) Acc@1: 79.688 ( 90.450) Acc@5: 100.000 ( 98.881)
Test: [ 340/782] Time: 1.877s (1.882s, 34.01/s) Loss: 1.0723 (0.5793) Acc@1: 71.875 ( 90.478) Acc@5: 93.750 ( 98.882)
Test: [ 350/782] Time: 1.878s (1.882s, 34.01/s) Loss: 0.4590 (0.5800) Acc@1: 96.875 ( 90.500) Acc@5: 98.438 ( 98.883)
Test: [ 360/782] Time: 1.878s (1.881s, 34.02/s) Loss: 1.0312 (0.5788) Acc@1: 60.938 ( 90.504) Acc@5: 100.000 ( 98.896)
Test: [ 370/782] Time: 1.878s (1.881s, 34.02/s) Loss: 0.5200 (0.5804) Acc@1: 92.188 ( 90.473) Acc@5: 98.438 ( 98.884)
Test: [ 380/782] Time: 1.875s (1.881s, 34.02/s) Loss: 0.6211 (0.5864) Acc@1: 92.188 ( 90.248) Acc@5: 95.312 ( 98.876)
Test: [ 390/782] Time: 1.878s (1.881s, 34.02/s) Loss: 0.5552 (0.5915) Acc@1: 90.625 ( 90.102) Acc@5: 98.438 ( 98.865)
Test: [ 400/782] Time: 1.877s (1.881s, 34.02/s) Loss: 0.4075 (0.5945) Acc@1: 96.875 ( 90.056) Acc@5: 100.000 ( 98.843)
Test: [ 410/782] Time: 1.876s (1.881s, 34.03/s) Loss: 0.5981 (0.5961) Acc@1: 87.500 ( 89.986) Acc@5: 98.438 ( 98.837)
Test: [ 420/782] Time: 1.878s (1.881s, 34.03/s) Loss: 0.5879 (0.6000) Acc@1: 92.188 ( 89.872) Acc@5: 100.000 ( 98.838)
Test: [ 430/782] Time: 1.879s (1.881s, 34.03/s) Loss: 0.5288 (0.5997) Acc@1: 89.062 ( 89.882) Acc@5: 100.000 ( 98.851)
Test: [ 440/782] Time: 1.879s (1.881s, 34.03/s) Loss: 0.4360 (0.5981) Acc@1: 92.188 ( 89.955) Acc@5: 100.000 ( 98.852)
Test: [ 450/782] Time: 1.876s (1.881s, 34.03/s) Loss: 0.5425 (0.5955) Acc@1: 93.750 ( 90.057) Acc@5: 98.438 ( 98.857)
Test: [ 460/782] Time: 1.877s (1.881s, 34.03/s) Loss: 0.4346 (0.5960) Acc@1: 93.750 ( 90.049) Acc@5: 98.438 ( 98.865)
Test: [ 470/782] Time: 1.877s (1.880s, 34.03/s) Loss: 0.4683 (0.5970) Acc@1: 95.312 ( 90.061) Acc@5: 100.000 ( 98.875)
Test: [ 480/782] Time: 1.877s (1.880s, 34.04/s) Loss: 0.4431 (0.5928) Acc@1: 96.875 ( 90.196) Acc@5: 98.438 ( 98.886)
Test: [ 490/782] Time: 1.875s (1.880s, 34.04/s) Loss: 0.4404 (0.5973) Acc@1: 96.875 ( 90.043) Acc@5: 100.000 ( 98.867)
Test: [ 500/782] Time: 1.878s (1.880s, 34.04/s) Loss: 0.3804 (0.5997) Acc@1: 96.875 ( 89.886) Acc@5: 100.000 ( 98.868)
Test: [ 510/782] Time: 1.877s (1.880s, 34.04/s) Loss: 0.5693 (0.5983) Acc@1: 92.188 ( 89.940) Acc@5: 98.438 ( 98.875)
Test: [ 520/782] Time: 1.877s (1.880s, 34.04/s) Loss: 0.3132 (0.6016) Acc@1: 98.438 ( 89.827) Acc@5: 100.000 ( 98.869)
Test: [ 530/782] Time: 1.879s (1.880s, 34.04/s) Loss: 0.3337 (0.6007) Acc@1: 98.438 ( 89.833) Acc@5: 100.000 ( 98.879)
Test: [ 540/782] Time: 1.879s (1.880s, 34.04/s) Loss: 0.8218 (0.6020) Acc@1: 79.688 ( 89.782) Acc@5: 100.000 ( 98.885)
Test: [ 550/782] Time: 1.876s (1.880s, 34.04/s) Loss: 0.4807 (0.6010) Acc@1: 93.750 ( 89.817) Acc@5: 100.000 ( 98.891)
Test: [ 560/782] Time: 1.879s (1.880s, 34.04/s) Loss: 0.6401 (0.6013) Acc@1: 87.500 ( 89.820) Acc@5: 100.000 ( 98.897)
Test: [ 570/782] Time: 1.876s (1.880s, 34.04/s) Loss: 0.7256 (0.6017) Acc@1: 89.062 ( 89.842) Acc@5: 96.875 ( 98.886)
Test: [ 580/782] Time: 1.879s (1.880s, 34.04/s) Loss: 0.8916 (0.6013) Acc@1: 78.125 ( 89.872) Acc@5: 98.438 ( 98.887)
Test: [ 590/782] Time: 1.876s (1.880s, 34.05/s) Loss: 0.4912 (0.6021) Acc@1: 96.875 ( 89.834) Acc@5: 96.875 ( 98.890)
Test: [ 600/782] Time: 1.877s (1.880s, 34.05/s) Loss: 0.3984 (0.6026) Acc@1: 96.875 ( 89.842) Acc@5: 100.000 ( 98.872)
Test: [ 610/782] Time: 1.878s (1.880s, 34.05/s) Loss: 0.4519 (0.6007) Acc@1: 93.750 ( 89.914) Acc@5: 100.000 ( 98.885)
Test: [ 620/782] Time: 1.877s (1.880s, 34.05/s) Loss: 0.5835 (0.6008) Acc@1: 90.625 ( 89.903) Acc@5: 100.000 ( 98.898)
Test: [ 630/782] Time: 1.879s (1.880s, 34.05/s) Loss: 0.4021 (0.6001) Acc@1: 96.875 ( 89.954) Acc@5: 100.000 ( 98.896)
Test: [ 640/782] Time: 1.878s (1.880s, 34.05/s) Loss: 0.6147 (0.6039) Acc@1: 92.188 ( 89.813) Acc@5: 98.438 ( 98.896)
Test: [ 650/782] Time: 1.878s (1.880s, 34.05/s) Loss: 0.3767 (0.6030) Acc@1: 98.438 ( 89.850) Acc@5: 100.000 ( 98.901)
Test: [ 660/782] Time: 1.879s (1.880s, 34.05/s) Loss: 0.5391 (0.6058) Acc@1: 93.750 ( 89.746) Acc@5: 98.438 ( 98.891)
Test: [ 670/782] Time: 1.876s (1.880s, 34.05/s) Loss: 0.6616 (0.6081) Acc@1: 89.062 ( 89.659) Acc@5: 100.000 ( 98.889)
Test: [ 680/782] Time: 1.876s (1.880s, 34.05/s) Loss: 0.3928 (0.6064) Acc@1: 96.875 ( 89.710) Acc@5: 100.000 ( 98.894)
Test: [ 690/782] Time: 1.877s (1.880s, 34.05/s) Loss: 0.6396 (0.6060) Acc@1: 90.625 ( 89.716) Acc@5: 100.000 ( 98.892)
Test: [ 700/782] Time: 1.877s (1.879s, 34.05/s) Loss: 0.4375 (0.6056) Acc@1: 93.750 ( 89.720) Acc@5: 98.438 ( 98.888)
Test: [ 710/782] Time: 1.876s (1.879s, 34.05/s) Loss: 0.6377 (0.6085) Acc@1: 87.500 ( 89.638) Acc@5: 100.000 ( 98.890)
Test: [ 720/782] Time: 1.875s (1.879s, 34.05/s) Loss: 0.4829 (0.6082) Acc@1: 93.750 ( 89.661) Acc@5: 100.000 ( 98.901)
Test: [ 730/782] Time: 1.878s (1.879s, 34.05/s) Loss: 0.6636 (0.6081) Acc@1: 87.500 ( 89.652) Acc@5: 96.875 ( 98.906)
Test: [ 740/782] Time: 1.877s (1.879s, 34.05/s) Loss: 1.0986 (0.6078) Acc@1: 67.188 ( 89.651) Acc@5: 100.000 ( 98.912)
Test: [ 750/782] Time: 1.877s (1.879s, 34.05/s) Loss: 1.3535 (0.6073) Acc@1: 64.062 ( 89.649) Acc@5: 98.438 ( 98.920)
Test: [ 760/782] Time: 1.879s (1.879s, 34.05/s) Loss: 0.6895 (0.6100) Acc@1: 87.500 ( 89.576) Acc@5: 100.000 ( 98.924)
Test: [ 770/782] Time: 1.878s (1.879s, 34.06/s) Loss: 0.2578 (0.6107) Acc@1: 100.000 ( 89.575) Acc@5: 100.000 ( 98.924)
Test: [ 780/782] Time: 1.878s (1.879s, 34.06/s) Loss: 1.0439 (0.6100) Acc@1: 73.438 ( 89.579) Acc@5: 100.000 ( 98.922)
* Acc@1 89.578 (10.422) Acc@5 98.922 (1.078)
--result
{
    "model": "eva02_large_patch14_448.mim_m38m_ft_in1k",
    "top1": 89.578,
    "top1_err": 10.422,
    "top5": 98.922,
    "top5_err": 1.078,
    "param_count": 305.08,
    "img_size": 448,
    "cropt_pct": 1.0,
    "interpolation": "bicubic"
}

I hope this answer helps you. Thank you.

Hankyul
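For reference, the Acc@1 / Acc@5 columns in the log are plain top-k accuracy: the percentage of samples whose true label appears among the k highest-scoring classes. A minimal pure-Python sketch with made-up toy scores (this is not validate.py's implementation):

```python
def topk_accuracy(scores, labels, k):
    """Percentage of samples whose label is among the k highest-scored classes."""
    hits = 0
    for row, label in zip(scores, labels):
        # indices of the k highest scores in this row
        topk = sorted(range(len(row)), key=lambda i: row[i], reverse=True)[:k]
        hits += label in topk
    return 100.0 * hits / len(labels)

# Toy 3-class scores for 3 samples (illustration only).
scores = [
    [0.1, 0.7, 0.2],  # top-1 prediction: class 1
    [0.5, 0.3, 0.2],  # top-1 prediction: class 0
    [0.2, 0.3, 0.5],  # top-1 prediction: class 2
]
labels = [1, 1, 2]  # sample 2's label is only the 2nd-ranked class

print(topk_accuracy(scores, labels, 1))  # 2 of 3 correct -> 66.66...
print(topk_accuracy(scores, labels, 2))  # all labels within top-2 -> 100.0
```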
-
I want to reproduce the evaluation results shown in README.md, but I do not know the transform associated with the pretrained model. I notice that there are configs in the Hugging Face config file (like this). How can I build the transform from this config file?
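For reference, the config file on the Hub carries a `pretrained_cfg` block whose fields match the data config printed in the reply above. As a rough sketch of what those fields contain, the dict below is a hand-written excerpt mirroring that log (not fetched from the Hub), with the transform parameters pulled out by hand:

```python
# Hand-written excerpt of the model's Hugging Face config.json, mirroring
# the data config printed in the evaluation log (assumption: illustrative only).
hf_config = {
    "pretrained_cfg": {
        "input_size": [3, 448, 448],
        "interpolation": "bicubic",
        "mean": [0.48145466, 0.4578275, 0.40821073],
        "std": [0.26862954, 0.26130258, 0.27577711],
        "crop_pct": 1.0,
        "crop_mode": "center",
    }
}

cfg = hf_config["pretrained_cfg"]
# These fields are roughly the kwargs that
# timm.data.create_transform(**data_config, is_training=False) consumes.
height, width = cfg["input_size"][1], cfg["input_size"][2]
print(height, width, cfg["crop_pct"], cfg["interpolation"])
```

In practice `timm.data.resolve_model_data_config(model)` does this extraction for you, as shown in the reply above.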