
Introduction

Our objective is to find the method that consumes the least memory and bandwidth. We are interested in cloning one specific state of the repository; we need neither its history nor the ability to push changes back to the server it was cloned from. The tests rely on a repository created by test_repo_creation.sh and measure memory and bandwidth usage for different Git commands.

Tests

The script consists of thirty tests (TEST0 to TEST29, listed in the results below), based on three functions: generate_random_file, get_storage_used and get_bandwidth.

generate_random_file uses the bash command dd and /dev/random. get_storage_used uses the bash command du. get_bandwidth retrieves the output of Git commands and extracts the traffic displayed. This does not take submodule traffic into account.
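The three helpers could be sketched as below. The names come from the script; the bodies are plausible reconstructions based on the description above, not the script's exact implementation.

```shell
# Write $2 MiB of random bytes to the file $1.
generate_random_file() {
    dd if=/dev/random of="$1" bs=1M count="$2" 2>/dev/null
}

# Disk usage of a path, in kilobytes.
get_storage_used() {
    du -sk "$1" | cut -f1
}

# Run a git command and keep only the transfer statistics it prints
# (git reports them on stderr, merged here with 2>&1). Submodule
# traffic comes from child processes and is not captured.
get_bandwidth() {
    "$@" 2>&1 | grep -o 'Receiving objects:.*' | tail -n 1
}
```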

The first five tests concern cloning. The following tests involve updating the repository using different commands, with three cases for each command: after adding a file, after deleting a file, after adding then deleting a file.
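The cloning variants can be illustrated with a self-contained example against a throwaway local repository (all paths below are placeholders, not the ones used by the script; the `file://` URL is needed so that `--depth` is honored for a local source):

```shell
set -e
work=$(mktemp -d)
git init -q -b main "$work/origin"
git -C "$work/origin" -c user.name=t -c user.email=t@t \
    commit -q --allow-empty -m "first"
git -C "$work/origin" -c user.name=t -c user.email=t@t \
    commit -q --allow-empty -m "second"

url="file://$work/origin"
git clone -q "$url" "$work/full"                              # TEST0: classic clone
git clone -q --single-branch "$url" "$work/single"            # TEST1
git clone -q --depth=1 --no-single-branch "$url" "$work/all"  # TEST2
git clone -q --depth=1 "$url" "$work/shallow"                 # TEST3

# The shallow clone keeps only the latest commit:
git -C "$work/shallow" rev-list --count HEAD   # prints 1
```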

Help extract

NAME
    performance_tests.sh
SYNOPSIS
    performance_tests.sh [-a] [-h] [-n number]
OPTIONS
    -a executes all the tests.
    -n number executes test number.
    -c cleans.
    -h prints the help. 
DESCRIPTION
    This script measures memory and bandwidth usage. The first five tests compare different cloning methods. The following tests apply changes to the local remote repository before measuring fetching and merging commands.
    TEST0: classic cloning 
    TEST1: --single-branch cloning
    TEST2: --depth=1 --no-single-branch cloning
    TEST3: --depth=1 cloning
    TEST4: --depth=1 with reflog and gc cloning
    TEST5: sparse-checkout 1M sample0 cloning
    _________________
    TEST6: classic fetching+checking out after addition of a 1M file
    TEST7: classic fetching+checking out after removal of a 1M file
    TEST8: classic fetching+checking out after addition then removal of a 1M file

    TEST9: --depth=1 fetching+checking out after addition of a 1M file
    TEST10: --depth=1 fetching+checking out after removal of a 1M file
    TEST11: --depth=1 fetching+checking out after addition then removal of a 1M file

    TEST12: --depth=1 fetching+checking out with reflog and gc after addition of a 1M file
    TEST13: --depth=1 fetching+checking out with reflog and gc after removal of a 1M file
    TEST14: --depth=1 fetching+checking out with reflog and gc after addition then removal of a 1M file

    TEST15: --depth=1 fetching + reset --hard after addition of a 1M file
    TEST16: --depth=1 fetching + reset --hard after removal of a 1M file
    TEST17: --depth=1 fetching + reset --hard after addition then removal of a 1M file

    TEST18: --depth=1 fetching + reset --hard with reflog and gc after addition of a 1M file
    TEST19: --depth=1 fetching + reset --hard with reflog and gc after removal of a 1M file
    TEST20: --depth=1 fetching + reset --hard with reflog and gc after addition then removal of a 1M file

    TEST21: --depth=1 fetching+checking out after addition of a 1M file in submodule
    TEST22: --depth=1 fetching+checking out after removal of a 1M file in submodule
    TEST23: --depth=1 fetching+checking out after addition then removal of a 1M file in submodule 

    TEST24: --depth=1 fetching+merging -X theirs with reflog and gc after addition of a 1M file
    TEST25: --depth=1 fetching+merging -X theirs with reflog and gc after removal of a 1M file
    TEST26: --depth=1 fetching+merging -X theirs with reflog and gc after addition then removal of a 1M file

    TEST27: --depth=1 fetching+merging -s ours with reflog and gc after addition of a 1M file
    TEST28: --depth=1 fetching+merging -s ours with reflog and gc after removal of a 1M file
    TEST29: --depth=1 fetching+merging -s ours with reflog and gc after addition then removal of a 1M file
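The shallow update path with reflog expiry and gc (TEST18 to TEST20) can be sketched end to end, under the assumption that it amounts to: fetch at depth 1, hard-reset to the remote tip, then drop the now-unreachable objects. All paths and names below are placeholders.

```shell
set -e
work=$(mktemp -d)
git init -q -b main "$work/origin"
git -C "$work/origin" -c user.name=t -c user.email=t@t \
    commit -q --allow-empty -m "base"
git clone -q --depth=1 "file://$work/origin" "$work/local"

# Simulate the upstream change (addition of a 1M file):
dd if=/dev/random of="$work/origin/sample" bs=1M count=1 2>/dev/null
git -C "$work/origin" add sample
git -C "$work/origin" -c user.name=t -c user.email=t@t \
    commit -q -m "add sample"

# Update the clone without accumulating history:
git -C "$work/local" fetch -q --depth=1 origin
git -C "$work/local" reset -q --hard origin/main
git -C "$work/local" reflog expire --expire=now --all
git -C "$work/local" gc --quiet --prune=now
```

After this sequence the working tree matches the remote tip while the local history stays one commit deep, which is what keeps the storage measurement low in these tests.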

To go further

To learn more about how the different Git methods were tested, please refer to doc/developement_explanations.md.