
CUDA: remove incorrect precision check #7454


Merged

Conversation

JohannesGaessler
Collaborator

Fixes the issue described in #7314 (comment). The problem is that the FP32 code path contains a precision assert that should only be present in the FP16 code path.
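For illustration, a minimal sketch of the dispatch pattern being described, with the assert placed where it belongs; the type and function names below are made up for the example and are not the actual ggml-cuda symbols:

```cpp
#include <cassert>

// Illustrative stand-ins for the precision flag and kernel parameters.
enum precision_t { PREC_DEFAULT, PREC_F32 };

struct fattn_params {
    precision_t precision;
};

// FP16 launcher: only reached when default (FP16) precision is requested,
// so checking the precision flag here is legitimate.
void flash_attn_f16(const fattn_params & p) {
    assert(p.precision == PREC_DEFAULT);
    // ... launch the FP16 kernel ...
}

// FP32 launcher: this is exactly the fallback taken when higher precision is
// requested, so asserting PREC_DEFAULT here fires incorrectly. This is the
// kind of check the PR removes.
void flash_attn_f32(const fattn_params & p) {
    (void) p;
    // assert(p.precision == PREC_DEFAULT);  // wrong in the FP32 path, removed
    // ... launch the FP32 kernel ...
}

void flash_attn_dispatch(const fattn_params & p) {
    if (p.precision == PREC_DEFAULT) {
        flash_attn_f16(p);
    } else {
        flash_attn_f32(p);
    }
}
```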

@github-actions bot added the Nvidia GPU (Issues specific to Nvidia GPUs) and ggml (changes relating to the ggml tensor library for machine learning) labels on May 22, 2024
Contributor

📈 llama.cpp server for bench-server-baseline on Standard_NC4as_T4_v3 for phi-2-q4_0: 535 iterations 🚀

Details (for performance-related PRs only)
  • Concurrent users: 8, duration: 10m
  • HTTP request: avg=8755.19ms p(95)=22304.41ms fails=, finish reason: stop=474 truncated=61
  • Prompt processing (pp): avg=98.25tk/s p(95)=408.86tk/s
  • Token generation (tg): avg=37.68tk/s p(95)=48.68tk/s
  • ggml-org/models/phi-2/ggml-model-q4_0.gguf parallel=8 ctx-size=16384 ngl=33 batch-size=2048 ubatch-size=256 pp=1024 pp+tg=2048 branch=cuda-fix-fat32-prec commit=5cf3c42412b8c496907348cae058b65b39fc2906

[Benchmark charts omitted from this text capture. Each chart, titled "llama.cpp bench-server-baseline on Standard_NC4as_T4_v3, duration=10m, 535 iterations", plots one metric over the run: llamacpp:prompt_tokens_seconds, llamacpp:predicted_tokens_seconds, llamacpp:kv_cache_usage_ratio, and llamacpp:requests_processing.]

JohannesGaessler merged commit 95fb0ae into ggml-org:master on May 22, 2024
63 of 74 checks passed
ggerganov added a commit that referenced this pull request May 22, 2024
teleprint-me pushed a commit to teleprint-me/llama.cpp that referenced this pull request May 23, 2024
teleprint-me pushed a commit to teleprint-me/llama.cpp that referenced this pull request May 23, 2024