
## Testing of different cloning methods
Our objective is to find the cloning method that consumes the least memory and bandwidth.
We are interested in cloning one specific state of the repository; we do not need its history, nor the ability to modify it from the server it was cloned from. The first step is handled by the script creation_repo.sh, which creates a suitable repository to act as a local remote. The testing itself is done by performance_tests.sh.

## Creation of the test repository
The script creation_repo.sh creates a remote/performance_testing repository. 
NAME
    creation_repo.sh
SYNOPSIS
    creation_repo.sh [-h] [-s]
DESCRIPTION
    This script creates a ./remote directory in the current directory, then creates a remote/performance_testing git repository.
    This git repository is filled with randomly generated binary files, as described below in this README.
OPTIONS
    -h prints the help.
    -s creates a submodule remote/submodule_for_performance_testing and includes it in remote/performance_testing.
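
For example, to generate the local remote together with its submodule, run the script from the directory that should contain ./remote:

```sh
# Create ./remote/performance_testing (and, thanks to -s, its submodule)
# in the current directory.
./creation_repo.sh -s
```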

Here is the commit history (an illustrative shell sketch follows these listings):
### branch main
commit  sample0 1M created (tag: start)
commit  sample1 1M created
commit  sample3 1M created
commit  sample4 5M created
commit  sample4 5M deleted
[if  -s is selected] commit adding submodule_for_performance_testing module
### branch secondary
commit  sample0 1M created 
commit  sample1 1M created
commit  sample2 500K created

Which gives the latest state:
### branch main
sample0	1M
sample1	1M
sample3	1M
### branch secondary
sample0 1M
sample1 1M
sample2 500K

If the -s option is selected, submodule_for_performance_testing is created with this history of commits:
### branch main
commit  first 1M sample created

Which gives the latest state:
### branch main
sub_sample0
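
To make the listings above concrete, here is a minimal sketch of shell commands that would produce a similar main/secondary history by hand. It is only an illustration: the branch point of secondary and the exact dd/git invocations are assumptions, and creation_repo.sh may proceed differently.

```sh
# Illustrative sketch only: creation_repo.sh may use different commands.
mkdir -p remote/performance_testing
cd remote/performance_testing
git init --initial-branch=main          # assumes git >= 2.28

# sample0: 1M of random binary data, tagged "start"
dd if=/dev/urandom of=sample0 bs=1M count=1
git add sample0 && git commit -m "sample0 1M created"
git tag start

dd if=/dev/urandom of=sample1 bs=1M count=1
git add sample1 && git commit -m "sample1 1M created"

# Assumption: secondary branches off after sample1 and only adds sample2.
git branch secondary

dd if=/dev/urandom of=sample3 bs=1M count=1
git add sample3 && git commit -m "sample3 1M created"

dd if=/dev/urandom of=sample4 bs=1M count=5
git add sample4 && git commit -m "sample4 5M created"
git rm sample4 && git commit -m "sample4 5M deleted"

git checkout secondary
dd if=/dev/urandom of=sample2 bs=500K count=1
git add sample2 && git commit -m "sample2 500K created"
git checkout main
```

With the -s option, creation_repo.sh additionally creates remote/submodule_for_performance_testing with its single sub_sample0 commit and registers it as a submodule of remote/performance_testing.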

## Testing
The script performance_tests.sh measures memory and bandwidth usage for different git commands. It sources the script driglibash-base and runs creation_repo.sh so that it has a suitable repository to test against.
NAME
    performance_tests.sh
SYNOPSIS
    performance_tests.sh [-a] [-h] [-n number]
OPTIONS
    -a executes all the tests.
    -n number executes test number.
    -h prints the help.
DESCRIPTION
    This script allows you to measure memory and bandwidth usage. The first six tests (TEST0 to TEST5) compare different cloning methods. The following tests apply changes to the local remote before measuring fetching and merging commands.
    TEST0: classic cloning 
    TEST1: --single-branch cloning
    TEST2: --depth=1 --no-single-branch cloning
    TEST3: --depth=1 cloning
    TEST4: --depth=1 with reflog and gc cloning
    TEST5: sparse-checkout cloning of the 1M sample0
    _________________
    TEST6: classic fetching+checking out after addition of a 1M file
    TEST7: classic fetching+checking out after removal of a 1M file
    TEST8: classic fetching+checking out after addition then removal of a 1M file

    TEST9: --depth=1 fetching+checking out after addition of a 1M file
    TEST10: --depth=1 fetching+checking out after removal of a 1M file
    TEST11: --depth=1 fetching+checking out after addition then removal of a 1M file

    TEST12: --depth=1 fetching+checking out with reflog and gc after addition of a 1M file
    TEST13: --depth=1 fetching+checking out with reflog and gc after removal of a 1M file
    TEST14: --depth=1 fetching+checking out with reflog and gc after addition then removal of a 1M file

    TEST15: --depth=1 fetching+reset --hard after addition of a 1M file
    TEST16: --depth=1 fetching+reset --hard after removal of a 1M file
    TEST17: --depth=1 fetching+reset --hard after addition then removal of a 1M file

    TEST18: --depth=1 fetching+reset --hard with reflog and gc after addition of a 1M file
    TEST19: --depth=1 fetching+reset --hard with reflog and gc after removal of a 1M file
    TEST20: --depth=1 fetching+reset --hard with reflog and gc after addition then removal of a 1M file

    TEST21: --depth=1 fetching+checking out after addition of a 1M file in submodule
    TEST22: --depth=1 fetching+checking out after removal of a 1M file in submodule
    TEST23: --depth=1 fetching+checking out after addition then removal of a 1M file in submodule 

    TEST24: --depth=1 fetching+merging -X theirs with reflog and gc after addition of a 1M file
    TEST25: --depth=1 fetching+merging -X theirs with reflog and gc after removal of a 1M file
    TEST26: --depth=1 fetching+merging -X theirs with reflog and gc after addition then removal of a 1M file

    TEST27: --depth=1 fetching+merging -s ours with reflog and gc after addition of a 1M file
    TEST28: --depth=1 fetching+merging -s ours with reflog and gc after removal of a 1M file
    TEST29: --depth=1 fetching+merging -s ours with reflog and gc after addition then removal of a 1M file
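
For reference, here is a hedged sketch of the kind of git commands behind the cloning tests (TEST0 to TEST5) and one of the shallow update sequences; paths and branch names are placeholders, and the exact flags used by performance_tests.sh may differ.

```sh
# file:// URL so that --depth and --single-branch behave as over the network
REMOTE="file://$PWD/remote/performance_testing"

# TEST0: classic clone
git clone "$REMOTE" test0

# TEST1: clone only the default branch
git clone --single-branch "$REMOTE" test1

# TEST2: shallow clone of every branch
git clone --depth=1 --no-single-branch "$REMOTE" test2

# TEST3: shallow clone of the default branch
git clone --depth=1 "$REMOTE" test3

# TEST4: shallow clone followed by reflog expiry and garbage collection
git clone --depth=1 "$REMOTE" test4
git -C test4 reflog expire --expire=now --all
git -C test4 gc --prune=now --aggressive

# TEST5: sparse checkout limited to the 1M sample0
git clone --depth=1 --no-checkout "$REMOTE" test5
git -C test5 sparse-checkout set --no-cone sample0   # --no-cone: sample0 is a single file (git >= 2.35)
git -C test5 checkout main                           # assumes the default branch is named main

# TEST12-14 style update: shallow fetch, forced checkout, then reflog expiry and gc
git -C test3 fetch --depth=1 origin
git -C test3 checkout -f origin/main                 # one plausible way to move to the fetched state
git -C test3 reflog expire --expire=now --all
git -C test3 gc --prune=now
```

Once the local remote exists, all measurements can be run with ./performance_tests.sh -a, or a single one with, for example, ./performance_tests.sh -n 3.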