The Week in Chess

Thursday, January 30, 2014

Stockfish 14012918 x64 vs. Stockfish 14011818 x64 - 100 Rounds, Regression Test

Stockfish 14012918 was released on January 29, 2014 by Marco Costalba, Joona Kiiski and Tord Romstad.

I decided once again to test it for regression and to see if it is suitable for rating list production with 100-round gauntlet matches.

Stockfish 14011818 was the target opponent, the same as in the previous test. The time control chosen was 20 seconds base + 300 milliseconds increment, both for a quick result and to observe whether this time control is suitable for testing chess engine strength.

As the results below show, the latest release, Stockfish 14012918, won the match 56-44, a 12-point margin, whereas the previous release won by only 2 points. The match is too short and too fast, which is why the result should not be taken seriously, but for a quick assessment it should be sufficient. I will have to run some longer tests before deciding whether it is time to let it face the gauntlet.
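For readers who want to translate a match score like 56-44 into an approximate Elo difference, here is a small sketch using the standard logistic Elo model (my own illustration, not part of the Arena toolchain):

```python
import math

def elo_diff(score_fraction):
    """Elo difference implied by a score fraction (0 < s < 1),
    using the standard logistic rating model."""
    return -400 * math.log10(1 / score_fraction - 1)

# 56 points out of 100 games:
print(round(elo_diff(56 / 100), 1))  # roughly +42 Elo
```

Note that with only 100 games the error bars on such an estimate are much larger than the estimate itself, which is why results like this are only indicative.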

Stockfish_14012918_x64 vs. Stockfish_14011818_x64 - Match 100R 20S+300ms
Rank  Engine                  Score     1         2         S-B
1     Stockfish_14012918_x64  56.0/100  · · ·     23-11-66  2464.00
2     Stockfish_14011818_x64  44.0/100  11-23-66  · · ·     2464.00

(Numbered columns give the W-L-D record against the engine of that rank; S-B = Sonneborn-Berger score.)


100 games played / Tournament finished

Tournament start: 2014.01.30, 16:22:32
Latest update: 2014.01.30, 19:10:28
Level: Blitz 0:20/0.3
Hardware: AMD Phenom(tm) II X4 945 Processor with 4 GB Memory
Operating system: Windows 7 Ultimate Professional Service Pack 1 (Build 7601) 64 bit
Table created with: Arena 3.5

Download the match games in PGN here.

Houdini 4 Standard beat Houdini 4 Pro

There were some posts in the chess forums claiming that Houdini 4 Standard is stronger than Houdini 4 Pro. I was intrigued, because by common sense Pro is supposed to be superior to Standard. So, I set out to explore the merits of the Houdini 4 Standard version. There are two 64-bit versions of Houdini, suffixed with "A" and "B". The "A" version is intended for very old computers, while the "B" version is for modern computers. I chose the "A" version to match my dinosauric computers. The computers used in the test were old machines bought around 3-5 years ago, with 4 gigabytes of RAM.

The first test was on the AMD quad-core computer, pitting Houdini 4 Standard against the Pro version over 100 rounds at a 1 minute + 1 second time control. The result, as the table below shows, was a narrow 52.5-47.5 win for Houdini 4 Pro, a 5-point advantage.

The second test was on an older Intel dual-core computer, also over 100 rounds but at the faster time control of 20 seconds base + 300 milliseconds increment. The result was an impressive win for Houdini Standard with a score of 80-20. The Standard-versus-Pro matches suggest that Standard is stronger, at least on the older Intel machine. This is good news for chess engine testers who do not have modern computers. If the matches were run on the latest computers, the scores might well swing in favor of Houdini Pro.

So now we see why there are so many versions of Houdini 4: each has specific strengths suited to certain types of computers and configurations. However, knowing which one is best for a particular computer may take time and money. A rule of thumb may be to buy the Standard version when the existing computer model is more than 3 years old, and the Pro version when it is less than 3 years old, relative to the CPU manufacturer's latest model.

I also ran a test match between Houdini 4 Standard and Stockfish 140127 to see if Stockfish could hold its own. The time control chosen was 1 minute + 1 second, to give the engines more time to think. And the result... Stockfish 140127 won with a score of 54.5-45.5, a 9-point margin. The difference is narrower than in the last match between Stockfish 140127 and Houdini 4 Pro.

Here are the results of the matches:

I - AMD QUAD CORE

Houdini 4 Pro x64 vs. Houdini 4 Std x64 - Match 100R 1M1S
Rank  Engine             Score     1         2         S-B
1     Houdini 4 Pro x64  52.5/100  · · ·     21-16-63  2493.75
2     Houdini 4 Std x64  47.5/100  16-21-63  · · ·     2493.75


100 games played / Tournament finished

Tournament start: 2014.01.30, 00:21:14
Latest update: 2014.01.30, 10:08:17
Level: Blitz 1/1
Hardware: AMD Phenom(tm) II X4 945 Processor with 4 GB Memory
Operating system: Windows 7 Ultimate Professional Service Pack 1 (Build 7601) 64 bit
Table created with: Arena 3.5


II - INTEL DUAL CORE

Houdini 4 Std x64 vs. Houdini 4 Pro x64 - Match 100R 20S+300MS
Rank  Engine             Score     1         2         S-B
1     Houdini 4 Std x64  80.0/100  · · ·     71-11-18  1600.00
2     Houdini 4 Pro x64  20.0/100  11-71-18  · · ·     1600.00


100 games played / Tournament finished

Tournament start: 2014.01.29, 17:37:08
Latest update: 2014.01.29, 21:26:52
Level: Blitz 0:20/0.3
Hardware: Intel(R) Core(TM)2 CPU 4300 @ 1.8 GHz with 3.9 GB Memory
Operating system: Windows 7 Ultimate Professional Service Pack 1 (Build 7601) 64 bit
Table created with: Arena 3.5


Stockfish 140127 x64 vs. Houdini 4 Std x64 - Match 100R 1M1S
Rank  Engine                Score     1         2         S-B
1     Stockfish 140127 x64  54.5/100  · · ·     35-26-39  2479.75
2     Houdini 4 Std x64     45.5/100  26-35-39  · · ·     2479.75


100 games played / Tournament finished

Tournament start: 2014.01.29, 17:35:29
Latest update: 2014.01.30, 18:54:47
Level: Blitz 1/1
Hardware: Intel(R) Core(TM)2 CPU 4300 @ 1.80 GHz with 3.9 GB Memory
Operating system: Windows 7 Ultimate Professional Service Pack 1 (Build 7601) 64 bit
Table created with: Arena 3.5


 Download the computer chess engines matches here.

Wednesday, January 29, 2014

Firenzina 2.4.1 xTreme x64 vs. Firenzina 2.4 xTreme x64 - Test Match, 100 Rounds

Firenzina 2.4.1 xTreme is a UCI chess engine derived from Fire 2.2, developed by Dimitri Gusev and released on January 26, 2014.

Firenzina 2.4.1 lost to its older sibling, Firenzina 2.4, by 7 points in the 100-round match at a 1 minute + 1 second time control. Therefore, it will not undergo the full gauntlet matches against the strongest engines.

The chess engine version used in the match was a personal compilation, because the only version supplied by the author would not work on my dinosauric computers made at Jurassic Park. The released 64-bit version was designed for modern computers with popcnt support and SSE4, and compiled with the Intel C++ Compiler. There is no 32-bit version, and my attempt to compile one failed because the source contained an undefined function. The first 64-bit compile also failed, due to an error in the badly coded InitCFG function, which I remedied by commenting out the #DEFINE InitCFG code. Perhaps many chess engine testers are put off by the incomplete package and its requirement for modern computers.

Here is the result of the test:

Firenzina 2.4.1 xTreme x64 vs. Firenzina 2.4 xTreme x64 - Match 100R 1M1S
Rank  Engine                      Score     1        2        S-B
1     Firenzina 2.4 xTreme x64    53.5/100  · · ·    16-9-75  2487.75
2     Firenzina 2.4.1 xTreme x64  46.5/100  9-16-75  · · ·    2487.75


100 games played / Tournament finished

Tournament start: 2014.01.28, 23:23:55
Latest update: 2014.01.29, 17:16:48
Level: Blitz 1/1
Hardware: AMD Phenom(tm) II X4 945 Processor with 4 GB Memory
Operating system: Windows 7 Ultimate Professional Service Pack 1 (Build 7601) 64 bit

Table created with: Arena 3.5
Download the computer chess engines tournament games here.

Owl Computer Chess Engines Rating List #90

The 90th Owl Computer Chess Engines Rating List was released on 01/29/2014. It features the gauntlet matches of DiscoCheck 5.2 x64 and SmarThink 1.50 x64.

View the full rating list here.

DiscoCheck 5.2 x64 - Gauntlet Matches, 100 Rounds

DiscoCheck 5.2 x64 by Lucas Braesch is a UCI chess engine released on January 10, 2014.

DiscoCheck scored 57.45% with 395 wins, 246 losses and 359 draws against the 10 selected computer chess engines. It posted a 2775 ELO rating and is number 44 in the Top 100 Chess Engines Rating List.
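The score percentage quoted in these reports is the usual chess convention: each win counts as one point and each draw as half a point. A quick sketch of the arithmetic, using DiscoCheck's figures:

```python
def score_percent(wins, losses, draws):
    """Score percentage with wins counting 1 point and draws 1/2."""
    games = wins + losses + draws
    return 100 * (wins + draws / 2) / games

# DiscoCheck's gauntlet: 395 wins, 246 losses, 359 draws over 1000 games
print(score_percent(395, 246, 359))  # 57.45
```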

Here are the statistics of DiscoCheck:
Rank Engine ELO Raw Games Score% Points Win Loss Draw Chg TF% Ply
1 Protector 1.5.0 x64 2868 127 100 62.50 62.5 44 19 37 -2 0.00 0
2 Minko Chess 1.3 x64 2724 57 100 52.00 52.0 30 26 44 10 0.00 0
3 DiscoCheck 5.2 x64 2775 45 1000 57.45 574.5 395 246 359 2775 0.10 238
4 Thinker 54D Inert x64 2753 42 100 49.50 49.5 28 29 43 2 0.00 0
5 Booot 5.2.0 x64 2735 26 100 47.00 47.0 30 36 34 5 0.00 0
6 Quazar 0.4 x64 2752 -21 100 39.50 39.5 18 39 43 -4 0.00 0
7 Umko 1.2 x64 2670 -26 100 39.50 39.5 25 46 29 7 0.00 0
8 Smaug 2.2.1 x64 2653 -37 100 37.50 37.5 19 44 37 10 0.00 0
9 Crafty 23.8 x64 2694 -53 100 35.50 35.5 21 50 29 -6 0.00 0
10 Nemo 1.0.1b x64 2684 -71 100 33.00 33.0 18 52 30 4 0.00 0
11 Tornado 4.88 x64 2645 -91 100 29.50 29.5 13 54 33 5 0.00 0
Download the computer chess engines tournament games here.

SmarThink 1.50 x64 - Gauntlet Matches, 100 Rounds

SmarThink 1.50 by Sergei Markoff is a UCI chess engine released on January 1, 2014.

SmarThink scored 47.6% with 329 wins, 377 losses and 294 draws against the 10 selected chess engines. It registered a 2677 ELO rating and is number 56 in the Top 100 Chess Engines Rating List.

Here are the performance stats of SmarThink:
Rank Engine ELO Raw Games Score% Points Win Loss Draw Chg TF% Ply
1 DiscoCheck 5.2 x64 2775 104 100 66.50 66.5 56 23 21 2775 0.00 0
2 EXchess 7.18b x64 2669 14 100 54.50 54.5 37 28 35 8 0.00 0
3 Booot 5.2.0 x64 2735 13 100 54.00 54.0 41 33 26 5 0.00 0
4 Nemo 1.0.1b x64 2684 12 100 54.50 54.5 38 29 33 4 0.00 0
5 Minko Chess 1.3 x64 2724 6 100 53.50 53.5 33 26 41 10 0.00 0
6 Cheng 4.0.36a x64 2701 3 100 52.50 52.5 42 37 21 5 0.00 0
7 SmarThink 1.50 x64 2677 -15 1000 47.60 476.0 329 377 294 2677 0.20 78
8 Smaug 2.2.1 x64 2653 -30 100 48.00 48.0 32 36 32 10 0.00 0
9 Umko 1.2 x64 2670 -30 100 47.50 47.5 32 37 31 7 0.00 0
10 Tornado 4.88 x64 2645 -31 100 48.00 48.0 36 40 24 5 0.00 0
11 Crafty 23.8 x64 2694 -47 100 45.00 45.0 30 40 30 -6 0.00 0
Download the computer chess engines tournament games here.

Tuesday, January 28, 2014

Stockfish 14011818 x64 vs. Stockfish 14012720 x64 - 100 Rounds Regression Test

This is an exploratory match to determine whether there has been a regression since Stockfish version 14011818. There was a blog post in which Marco Costalba advised a chess engine tester to use this version instead of the latest development build of Stockfish because of a regression.

Well, the results of this fast blitz at 30 seconds base + 500 milliseconds increment show that the latest build, 14012720, released on January 27, finished with a 2-point margin over the older build released on January 18. There have been about 4 minor patches since January 18, which should give some ELO increase, but the result is statistically insignificant. There may be no meaningful improvement of the latest build over the supposedly stronger older build.
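One way to quantify how little a 2-point margin means is the likelihood of superiority (LOS): the estimated probability that the match winner really is the stronger engine, given the win/loss counts (draws cancel out). A sketch of the usual normal approximation, applied to this match's +18-16 decisive games (my own illustration):

```python
import math

def los(wins, losses):
    """Likelihood of superiority under the normal approximation:
    probability the winner is genuinely stronger; draws are ignored."""
    return 0.5 * (1 + math.erf((wins - losses) / math.sqrt(2 * (wins + losses))))

# 18 wins vs. 16 losses (66 draws) from the regression match:
print(round(los(18, 16), 2))  # about 0.63, far from conclusive
```

An LOS of roughly 63% is barely better than a coin flip, which is consistent with calling the result statistically insignificant.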

Here is the result:

Stockfish_14011818_x64 vs. Stockfish_14012720_x64 - Match 100R 30S+.5S
Rank  Engine                  Score     1         2         S-B
1     Stockfish_14012720_x64  51.0/100  · · ·     18-16-66  2499.00
2     Stockfish_14011818_x64  49.0/100  16-18-66  · · ·     2499.00


100 games played / Tournament finished

Tournament start: 2014.01.28, 14:53:57
Latest update: 2014.01.28, 19:17:16
Level: Blitz 0:30/0.5
Hardware: AMD Phenom(tm) II X4 945 Processor with 1.8 GB Memory
Operating system: Windows 7 Ultimate Professional Service Pack 1 (Build 7601) 64 bit
Table created with: Arena 3.5
Download the chess match games here.

Gull 2.8beta vs. Gull 2.9alpha vs. Gull R600 - Test Matches

Gull 2.8 beta and Gull 2.9 alpha were released by Vadim Demichev on January 26, 2014. According to the author, Gull 2.8 beta is more or less the same strength as Gull R900, and Gull 2.9 alpha is a development version that will eventually become 3.0, which may see action in the coming TCEC tournament. The evaluation function in Gull 2.8b is simply copied from 2.9a, while Gull 2.9 alpha has no multi-processor support.

For curiosity's sake, I ran a 100-round mini tournament between the 3 Gull versions at 30 seconds base + 500 milliseconds increment. The result matched the likely outcome as described by the author: Gull 2.8b won the tournament by a negligible margin over Gull R600, while Gull 2.9a placed last, far behind its two siblings, probably due to its lack of multi-processor support.

The results are displayed below:

Gull 2.8b x64 vs. Gull 2.9a x64 vs. Gull R600 - Test Match 100R 30S+500MS
Rank  Engine         Score      1         2         3         S-B
1     Gull 2.8b x64  109.5/200  · · ·     30-27-43  27-11-62  10353.50
2     Gull R600 x64  107.0/200  27-30-43  · · ·     38-21-41  10195.50
3     Gull 2.9a x64  83.5/200   11-27-62  21-38-41  · · ·     9039.50


300 games played / Tournament finished

Tournament start: 2014.01.27, 15:01:47
Latest update: 2014.01.28, 04:25:19
Level: Blitz 0:30/0.5
Hardware: AMD Phenom(tm) II X4 945 Processor with 4 GB Memory
Operating system: Windows 7 Ultimate Professional Service Pack 1 (Build 7601) 64 bit
Table created with: Arena 3.5



(Note for the Arena team: there might be a bug in the PGN recording, as the time control of 30 seconds + 500 milliseconds is recorded as [TimeControl "30+0"].)
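For anyone checking their own PGN output for this issue: the PGN standard records such a time control as base and increment in seconds, i.e. [TimeControl "30+0.5"]. A minimal sketch (my own helper, not Arena code) for splitting the tag value:

```python
def parse_time_control(tag_value):
    """Parse a PGN TimeControl tag value of the form 'base+increment'
    (both in seconds) into a (base, increment) tuple of floats."""
    base, _, inc = tag_value.partition("+")
    return float(base), float(inc or 0)

print(parse_time_control("30+0.5"))  # (30.0, 0.5), the correct tag
print(parse_time_control("30+0"))    # (30.0, 0.0), what Arena wrote
```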

Download the chess match games here.

Monday, January 27, 2014

Stockfish 140119 x64 vs. Komodo TCEC x64 - Match 40 rounds, 30Min + 10Sec

Will Komodo win again in the coming TCEC Chess Engines Tournament? That will finally be answered when the time comes.

In the meantime, I assessed, in an unscientific way, whether Stockfish has a chance against Komodo at a longer time control. Komodo won by just 2 points against Stockfish in the recent final, with all 48 games of the limit played. Stockfish has gained 30 ELO points since the DD version that competed in that final, which is probably sufficient to edge Komodo in this moderately long time control match.
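Under the standard logistic Elo model, a 30-point edge translates into only a modest expected score; a quick back-of-the-envelope sketch:

```python
def expected_score(elo_advantage):
    """Expected score fraction for the stronger side, given its Elo
    advantage, under the standard logistic rating model."""
    return 1 / (1 + 10 ** (-elo_advantage / 400))

# A 30-Elo advantage:
print(round(expected_score(30), 3))  # about 0.543, i.e. roughly 54.3%
```

Over 40 games, that corresponds to an expected margin of only about 3.5 points, so a larger winning margin would suggest the gap is bigger than the ratings imply.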

The match was arranged using the following conditions:
  - Computer: AMD dual core
  - RAM: 4GB
  - Operating system: Windows 7
  - Time control: 30 minutes base + 10 seconds increment
  - GUI: Arena Chess GUI 3.5
  - Number of cores used: 2
  - Hash table: 128MB

After 3 days of continuous, intense chessboard battle, Stockfish 140119 emerged victorious against Komodo TCEC with +12-5=23, a 7-point margin that Komodo would be very unlikely to overturn even if the match were extended to 48 rounds, since it would need to win all eight remaining games.

Here is the official result of the match:

Stockfish 140119 x64 vs. Komodo TCEC x64 - Match 40R 30M10S
Rank  Engine                Score    1        2        S-B
1     Stockfish 140119 x64  23.5/40  · · ·    12-5-23  387.75
2     Komodo TCEC x64       16.5/40  5-12-23  · · ·    387.75


40 of 40 games played

Tournament start: 2014.01.24, 22:04:09
Latest update: 2014.01.27, 22:42:14
Level: Blitz 30/10
Hardware: AMD Athlon(tm) II X2 250 Processor with 4 GB Memory
Operating system: Windows 7 Ultimate Professional Service Pack 1 (Build 7601) 64 bit
Table created with: Arena 3.5
Download the match games here.

Friday, January 24, 2014

Stockfish 140119 vs. Komodo TCEC - Match 5Min + 3Secs

While waiting for the next TCEC tournament coming in a few weeks, I used the idle time to test whether Komodo TCEC does better at the longer time control of 5 minutes + 3 seconds than at the 1 minute base + 1 second limit of my normal rating list production. This time control was suggested by Larry Kaufman as a better one for letting chess engines "think" properly.

The result of the match shows that Komodo TCEC scored 41 points from 100 games, 3 points better than in the last match. That may just be random variation, but it again confirms that Stockfish is stronger than Komodo, at least at blitz time controls.

Here are the match statistics:

Stockfish 140119 x64 vs. Komodo TCEC x64 - Match 100R 5M3S
Rank  Engine                Score     1         2         S-B
1     Stockfish 140119 x64  59.0/100  · · ·     34-16-50  2419.00
2     Komodo TCEC x64       41.0/100  16-34-50  · · ·     2419.00


100 games played / Tournament finished

Tournament start: 2014.01.23, 16:25:06
Latest update: 2014.01.24, 21:17:29
Level: Blitz 5/3
Hardware: AMD Athlon(tm) II X2 250 Processor with 4 GB Memory
Operating system: Windows 7 Ultimate Professional Service Pack 1 (Build 7601) 64 bit
Table created with: Arena 3.5

Download the match games in PGN here.

Wednesday, January 22, 2014

GrandMonsters SuperBlitz Tournament

Immediately after the Stockfish 140119 x64 gauntlet matches, I conducted a follow-up test tournament with the twin purposes of determining whether the latest Arena GUI 3.5 software can reliably handle a tournament with sub-second increments, and of finding out whether Houdini really is still the king of blitz chess engine tournaments.

The previous Arena GUI 3.0 version did not cope well with very fast time controls below 1 second per move, producing many losses by time forfeit. This probably prevented testers from using it seriously for producing rating lists at super blitz.

Houdini is well known as the number one chess engine on most rating list websites and has a reputation for being the best at very short time controls. However, that reputation was blemished when it failed to reach the final of the recent TCEC Live Chess Engines Tournament, and some testers, including this one, have reported in the forums that Houdini is no longer number one in blitz tournaments.

To make the tournament fair, I selected the other two top chess engines, Stockfish and Komodo, to battle Houdini in the competition. Arena GUI 3.5 was chosen for one of the reasons above; it also features a common setting for the number of CPU cores and the hash table size. The computer used was a generic build with a dual-core AMD processor and 4 GB of memory, with 128 MB allocated for the hash table. The time control was 25 seconds base + 250 milliseconds increment. The 3 chess engines faced each other in a round robin of 100 games per pairing. No other major programs were running simultaneously, and only one instance of the tournament was running, so all the computer's resources were available to the competing engines.

The tournament was started just before I went to sleep in the early morning, and when I woke up I had these results:

SuperBlitz Tournament - 25S+250MS
Rank  Engine                Score      1         2         3         S-B
1     Stockfish 140119 x64  119.5/200  · · ·     39-23-38  47-24-29  10728.00
2     Houdini 4 Pro x64     106.5/200  23-39-38  · · ·     47-18-35  9792.00
3     Komodo TCEC x64       74.0/200   24-47-29  18-47-35  · · ·     8381.50


300 games played / Tournament finished

Tournament start: 2014.01.22, 01:15:38
Latest update: 2014.01.22, 10:43:48
Site/ Country: CYBIRD-PC2, United States
Level: Blitz 0:25/0.25
Hardware: AMD Athlon(tm) 64 X2 Dual Core Processor 5000+ with 1.8 GB Memory
Operating system: Windows 7 Ultimate Professional Service Pack 1 (Build 7601) 64 bit
Table created with: Arena 3.5
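The S-B column in these tables is the Sonneborn-Berger tiebreak score, which weights the points scored against each opponent by that opponent's total score. A small sketch of the standard formula (my own illustration), reproducing Stockfish's figure from the table above:

```python
def sonneborn_berger(results):
    """Sonneborn-Berger score: sum of (points scored against each
    opponent) x (that opponent's total score).
    results: list of (points_vs_opponent, opponent_total) pairs."""
    return sum(points * total for points, total in results)

# Stockfish scored 58.0 points vs. Houdini (total 106.5)
# and 61.5 points vs. Komodo (total 74.0):
print(sonneborn_berger([(58.0, 106.5), (61.5, 74.0)]))  # 10728.0
```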

The result table above mirrors the ranking of the finishers in the latest rating list. The scores are also very close to those of the last gauntlet matches at 1 minute base + 1 second increment. Stockfish was the clear winner, with +39-23=38 against Houdini and +47-24=29 against Komodo. Houdini won against Komodo with +47-18=35.

As for the reliability of Arena GUI 3.5, the resulting PGN file showed only 1 loss by time forfeit, and it was Houdini's. It is well known that Houdini has a time management problem, as posted in many forums and even seen at the TCEC tournament, where it crashed. I can say with delight that Arena GUI 3.5 is now very usable at ultra-fast tournament time controls. What is nice is that I was able to watch the tournament comfortably, following the moves, unlike with Little Blitzer or CuteChess, which show only the statistics and not the actual game in progress.

The choice of fast or slow time controls matters little for the production of a rating list, because the rankings of the chess engines stay essentially in place, with only minor deviations, as can be seen by comparing the rating lists of sites with different time controls. Fast time controls are nice for producing quick results, while long time controls are good for producing quality games in PGN.

Conclusion:
Arena GUI 3.5 is now very good for fast time controls, and Houdini's dominance of the number 1 position at fast blitz time controls is over.

Download the test tournament results here.

Tuesday, January 21, 2014

Stockfish 140119 x64 - Gauntlet Matches, 100 Rounds

Stockfish 140119 x64 is a UCI chess engine by Marco Costalba, Joona Kiiski and Tord Romstad, compiled from the development version released on January 19, 2014.

Stockfish 140119 scored 73.5% with 1138 wins, 151 losses and 811 draws against the 21 strongest computer chess engines. It defeated all of them, including the mighty Houdini 4 Pro and Komodo TCEC, champion of the recent TCEC Live chess engines tournament. Stockfish registered an ELO of 3237, a 29-point increase over the last official DD version, and retained the number 1 position on the Owl Computer Chess Engines Rating List. Stockfish is currently 52 ELO rating points ahead of the second-ranked engine, Houdini 4 Pro.

The tournament was conducted using the latest Arena GUI 3.5, which showed noticeably fewer games lost by time forfeit than the older 3.0 version. Also worth mentioning is that Strelka 5.7 did not have a single loss by time forfeit, which is amazing considering that it lost as much as 50% of its games that way under the previous Arena version when the multi-processor feature was used. In this tournament, Strelka was using the full power of its multi-processor capabilities.

Here are the gauntlet statistics of Stockfish 140119:


Rank Engine ELO Raw Games Score% Points Win Loss Draw Chg TF% Ply
1 Stockfish 140119 x64 3237 146 2100 73.50 1543.5 1138 151 811 3237 0.10 210
2 Houdini 4 Pro x64 3185 97 100 41.50 41.5 18 35 47 2 0.00 0
3 Komodo TCEC x64 3163 71 100 38.00 38.0 17 41 42 0 1.00 167
4 Samsung x64 3105 40 100 32.50 32.5 10 45 45 2 0.00 0
5 Robodini 1.1 x64 3094 30 100 31.00 31.0 9 47 44 0 0.00 0
6 ComStock 3 x64 3070 6 100 27.00 27.0 4 50 46 0 0.00 0
7 Bouquet 1.8 x64 3067 4 100 27.00 27.0 5 51 44 -1 0.00 0
8 Fire 3.0 x64 3068 4 100 26.00 26.0 1 49 50 -1 0.00 0
9 Gull R600 x64 3104 0 100 27.00 27.0 7 53 40 -2 1.00 116
10 PanChess 00.537 x64 3052 -3 100 27.00 27.0 8 54 38 0 0.00 0
11 RyanFish 1 x64 3068 -4 100 27.00 27.0 9 55 36 0 0.00 0
12 Firenzina 2.4 xTreme x64 3057 -8 100 25.50 25.5 4 53 43 1 0.00 0
13 Critter 1.6a x64 3092 -9 100 27.00 27.0 11 57 32 0 1.00 176
14 Tactico Power 2011 x64 3028 -14 100 25.00 25.0 5 55 40 1 0.00 0
15 RobboLito 0.21Q x64 3045 -25 100 23.50 23.5 4 57 39 0 0.00 0
16 Igorrit 0.086v9 x64 3047 -26 100 25.00 25.0 10 60 30 1 0.00 0
17 Strelka 5.7 x64 3081 -28 100 23.50 23.5 5 58 37 0 0.00 0
18 Rybka 4.1 x64 3028 -34 100 23.00 23.0 6 60 34 -1 0.00 0
19 Ivanhoe 46h x64 3035 -37 100 22.50 22.5 5 60 35 1 5.00 83
20 Saros 4.1.6 x64 3039 -59 100 20.00 20.0 4 64 32 -3 0.00 0
21 Mars 1 x64 3060 -59 100 20.00 20.0 4 64 32 0 0.00 0
22 LEOpard 0.7c x64 3025 -93 100 17.50 17.5 5 70 25 -1 0.00 0
Download the computer chess engines tournament games here.
