Computers have progressed a very long way since their earliest days. It may be less well known, though, that computers have been programmed to play games since almost the very beginning. We doubt that the hardy coding pioneers of the time could have dreamed just how far the state of the art has come since then.
Certainly well known today are Google DeepMind's phenomenal "Alpha" game-playing programs, which are built on self-teaching or "machine learning" methods. After years and years of computer Go programs barely reaching respectable playing levels, AlphaGo appeared on the scene and defeated one of the world's highest-ranked Go players, something no one ever expected. And AlphaZero very quickly became able to defeat even the strongest chess-playing programs around. Machine learning is here to stay, and the results are phenomenal.
Of course, the idea isn't new; it was proposed by an IBM scientist over 60 years ago. But implementing it successfully was the problem: to succeed, a program would have to "train" on millions and millions of different game positions, and that wasn't a realistic possibility until relatively recent years.
The method worked well at first for non-deterministic games--- games with an element of luck. GNU Backgammon played at the master level, as did others. But applications to checkers largely failed. Blondie24 was a lot of fun but never a serious competitor, and NeuroDraughts wasn't fully developed.
All that has changed, though, with renowned checker engine programmer Ed Gilbert's latest developments for his world class Kingsrow computer engine. Ed was kind enough to send us the details. The following was written by Ed Gilbert with input from Rein Halbersma.
Until recently, Kingsrow used a manually built and tuned evaluation function. This function computes a numeric score for a game position based on a number of material and positional features. It looks at the number of men and kings of each color, and position attributes including back rank formation, center control, tempo, left-right balance, runaways (men that have an open path to crowning), locks, bridges, tailhooks, king mobility, dog-holes, and several others. Creating this function requires some knowledge of checkers strategy, and is very time consuming.
The latest Kingsrow has done away with these manually constructed and tuned evaluation features. Instead it is built using machine learning (ML) techniques which require no game specific knowledge other than the basic rules of the game. It has learned to play at a level significantly stronger than previous versions entirely through self-play games.
In a test match of 16,000 blitz games (11-man ballot, 0.3 seconds per move), it scored a +72 Elo advantage over the best manually built and tuned eval version. The ML Kingsrow won more than five times as many games as it lost.
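As a rough sanity check on those figures, the standard logistic Elo model relates a rating gap to an expected match score. A minimal sketch (the exact point scoring of the Kingsrow match isn't given here, so the roughly 60% score below is only illustrative):

```python
import math

def elo_advantage(score: float) -> float:
    """Convert a match score (fraction of points won, 0..1) to an
    Elo rating difference, using the standard logistic Elo model."""
    return 400.0 * math.log10(score / (1.0 - score))

# Illustrative only: a roughly 60% match score corresponds to
# about a +72 Elo advantage.
print(round(elo_advantage(0.602)))  # → 72
```

With many draws in a 16,000-game match, a +72 Elo edge is consistent with winning several times as many games as you lose.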
The ML eval uses a set of overlapping rectangular board regions. These regions are either 8 or 12 squares, depending on whether kings are present. For every configuration of pieces on these squares, a score is assigned by the machine learning process. A position evaluation is then simply the sum of the scores of each region, plus something for any material differences in men and kings. In the 8-square regions, each square can either be empty or occupied by one of the four piece types, so there are a total of 5^8 = 390,625 configurations. In the 12-square regions there are no kings, so there are 3^12 = 531,441 configurations.
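The exact region shapes and square numbering are Kingsrow internals, but the table-lookup idea can be sketched generically: each square's contents become a digit in a fixed-radix number, and that number indexes directly into a table of learned scores. A hypothetical illustration:

```python
# Sketch of indexing piece configurations in a fixed board region.
# Each square's contents are encoded in a small alphabet, and the
# region's configuration becomes a fixed-radix integer that indexes
# a table of learned scores.

EMPTY, BLACK_MAN, WHITE_MAN, BLACK_KING, WHITE_KING = range(5)

def region_index(contents, radix):
    """contents: list of per-square codes for the region's squares.
    radix: 5 if kings may be present (8-square regions),
    3 if only men can appear (12-square regions)."""
    idx = 0
    for c in contents:
        assert 0 <= c < radix
        idx = idx * radix + c
    return idx

# An 8-square region with kings allowed: 5**8 = 390,625 configurations.
assert region_index([WHITE_KING] * 8, 5) == 5**8 - 1
# A 12-square region, men only: 3**12 = 531,441 configurations.
assert region_index([WHITE_MAN] * 12, 3) == 3**12 - 1
```

Evaluating a position is then just a handful of table lookups plus an addition per region, which is why this style of eval can be both strong and very fast.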
To compute values for each configuration, a large number of training positions are needed. I created a database of approximately one million games through rapid self-play. Each game took about 5 seconds. The positions are extracted from the games, and each position is assigned the win, draw, or loss value of the game result. Initially the values in the rectangular board regions are assigned random values. Through a process called logistic regression, the values are adjusted to minimize the mean squared error when comparing the eval output of each training position to the win, draw, or loss value that was assigned from the game results.
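The training process Ed describes can be sketched in miniature. This is not Kingsrow's actual code; it is a hypothetical toy in which each position activates one table entry per region, the evaluation is the sum of active entries squashed through a sigmoid, and gradient descent on the squared error against the assigned game result adjusts the entries:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train(positions, targets, n_entries, epochs=2000, lr=0.5):
    """positions: for each training position, the list of table
    indices activated by its region configurations.
    targets: the game result assigned to that position
    (1.0 = win, 0.5 = draw, 0.0 = loss).
    Starts from small random values and adjusts the table so that
    sigmoid(sum of active entries) fits the targets."""
    table = [random.uniform(-0.01, 0.01) for _ in range(n_entries)]
    for _ in range(epochs):
        for idxs, t in zip(positions, targets):
            p = sigmoid(sum(table[i] for i in idxs))
            g = (p - t) * p * (1.0 - p)  # gradient of squared error
            for i in idxs:
                table[i] -= lr * g
    return table

# Toy data: 4 table entries, 3 "positions" labeled win/draw/loss.
random.seed(0)
tbl = train([[0, 1], [1, 2], [2, 3]], [1.0, 0.5, 0.0], 4)
print(sigmoid(tbl[0] + tbl[1]))  # trained toward 1.0
print(sigmoid(tbl[2] + tbl[3]))  # trained toward 0.0
```

The real training set is vastly larger, of course, and a production implementation would use batched updates and a proper optimizer, but the principle is the same: the table entries that keep showing up in won positions drift positive, and those from lost positions drift negative.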
Similar machine learning techniques have been used in other board game programs. In 1997, Michael Buro described a similar process that he used to build the evaluation function for his Othello program named Logistello. In 2015, Fabien Letouzey created a strong 10x10 international draughts program named Scan using an ML eval, and around this time Michel Grimminck was using an ML eval in his program Dragon. Since then other 10x10 programs have switched to ML evals, including Kingsrow, and Maximus by Jan-Jaap van Horssen. I think that the English and Italian variants of Kingsrow are the first 8x8 programs to use an ML eval.
Ed's new super-strong version of KingsRow is available for free download from his website. Combine that with his 10-piece endgame database, and you'll have by far the strongest checker engine in the world, a fearsome competitor and an incredible training partner.
Let's look at a few difficult positions, some of which were analyzed by human players for years and even by reasonably strong computer engines for hours. KingsRow ML solved each and every one of them virtually instantly.
First, the so-called "100 years problem" (as in Boland's Masterpiece, p. 125 diagram 1).
Next, the Phantom Fox Den, from Basic Checkers 2010, p. 260.
And finally, a position suggested by Richard Pask, from Complete Checkers p. 273 halfway down, where Mr. Pask notes: "12-16?! has shock value ..."
Surely we don't expect you to solve each of these (unless you wish to), but do look them over and at least form an opinion. Then click on Read More and be amazed.[Read More]
Many golf enthusiasts just can't get enough, and while we don't know how they do it, some of them golf even during the winter when snow covers the ground. They're out taking their "winter strokes" and loving every minute of it.
We have a different "winter stroke" to offer you. We don't know if snow is as yet on the ground where you live--- it might well be in some locations--- but we're pretty sure you've kept the snow off your checkerboard. Here's a stroke problem that will stretch your powers of visualization.
Solving these problems is par for the course; when you're done, click on Read More to score your solution.[Read More]
A few weeks back, we presented Ed Gilbert's review of checker apps for the iPhone. We promised a similar review of Android apps, and we're bringing you that review today.
The iPhone scene was frankly pretty bleak. Ed didn't find a single truly serious checker app for the popular Apple smartphone. We'll cut to the chase: for Android phones, the situation is somewhat better. There are two reasonable candidates for the "serious app" title and a whole host of "toy" apps.
As was the case with the iPhone, there is just too much detail for a single weekly column, so instead you'll find the full details on this separate web page.
The two apps that we think merit consideration are Checkers Tutor, by world class checker programmer Martin Fierz (author of CheckerBoard and the Cake computer engine), and Checkers for Android by programmer Aart Bik, who is best known for Chess for Android. Aart's checker app is free and Martin's app sells for just one dollar.
Interesting and detailed email interviews with both Martin and Aart can be found on the web page linked above.
So, which app should you install? Checkers Tutor plays a reasonable game, certainly well above the casual skill level. The same can be said of Checkers for Android. It's also safe to say that neither of these will provide a serious challenge to a master-level player.
Our usual benchmark is Martin Fierz's Simple Checkers engine. Both Checkers Tutor and Checkers for Android play somewhat better than Simple, at least based on the limited trial matchups conducted in our offices. But you won't get play at the level of Cake or KingsRow, either.
There's some irony here. Both Martin and Aart received a lot of user feedback saying that the engines are too strong and that the user can never win a game! This perhaps says more about the casual player than it does about the strength of the computer engines.
How do Checkers Tutor (CT) and Checkers for Android (CFA) compare with each other?
CT has hands-down the better graphics and display, and some additional play features, such as move take-back and replay, neither of which are present in CFA. Both programs lack a move list or a means of exporting moves. CT has optional square numbers; CFA allows compulsory jumping to be turned off, which you may consider a feature or an anti-feature, but Aart says users demanded it. CT allows for selection of a random 3-move ballot. CFA has a tiny endgame database which most notably will allow it to convert a two kings versus one king win, something a surprising number of programs can't do.
Neither app allows for position set-up, but neither engine is really strong enough to do meaningful analysis.
Which engine plays better checkers? That's a good question. We think CT has a definite edge, albeit not a large one. In the two head-to-head matches that we conducted, CT handily won one of them. CFA got a winning position in the other game but then couldn't figure out the ending and the result was a draw. Such limited testing is hardly decisive, of course.
Our bottom line is that there's little reason not to install both apps and make your own comparison. It will only cost you a dollar, after all, and you'll have a lot of fun ahead of you.
Several of the test games are on the companion page mentioned earlier. What we'll show you here is an excerpt from one of the games between CT and CFA. We'll stop at one of the game's critical points and let you take over.
The game was played on April 18, 2012 at the Hawai`i State Library. Checkers Tutor had Black and played at 15 seconds per move. Checkers for Android had White and played at 10 seconds per move (it wasn't possible to set the same timing for both).
An inferior move; 22-17 or 22-28 are better.
4-8 holds the edge.
A checker subtlety: 22-17 equalizes, while this does not.
4-8 would have maintained the lead.
This move loses; 30-26 is correct. CT now has a win on the board but won't find it. Will you?
Who's better, CT, CFA, or you? Match wits with the computer, then click on Read More to see the solution.[Read More]
The telephone above is most definitely not an iPhone, but it does seem to pretty well represent the state of the art when it comes to playing checkers on an iPhone.
This article is the first of two on smartphone checker apps. Ed Gilbert, author of the world-class KingsRow checker engine and companion 10-piece endgame database, has graciously given of his time in order to evaluate a group of checker apps for the iPhone. A future article will look into checker apps for Android-based phones.
Ed's article is extensive and includes large graphics, so it merits its own web page. You can find it here, but we'll give you Ed's bottom line right away: there's not much out there that has merit for the serious player. That's indeed regrettable, because from what Ed shows us, we can't help but conclude that the iPhone app authors could have done much better without a lot of additional effort. Unfortunately, most checker program authors think they are producing toys rather than serious game-playing programs, and that's just what they end up doing.
Let's look at a sample game that Ed ran between the iPhone checker apps "Teeny Checkers" and "Fantastic Checkers".
White now has a simple move that very likely leads to a win. Can you spot it?
We'll give you the answer and find out how the game progressed when you dial your mouse to Read More.[Read More]
Some of the problems and study material presented in our weekly columns are, to say the least, somewhat challenging for the average player. But we've always tried to make a point of having something for everyone, so, once again, we're "taking a break" from the really hard stuff and presenting a problem that is interesting, practical, and not so difficult.
Black is a man up, but not for long. Still, there is a very nice win on the board. Can you see it? We'd rate this problem as no higher than "intermediate" in difficulty, and we're sure the more experienced players will call it "easy." Whatever it may be, can you solve it? The truly easy thing about it: clicking on Read More will lead you right to the solution.[Read More]
A few years ago, Ed Gilbert, the author of the KingsRow computer checkers engine, and the creator of the companion 10-piece endgame database, sent some new play to one of checker's sharpest-eyed analysts, Brian Hinkle. Ed told Brian the following:
"The 5-9 24-20 Double Cross is indeed a draw. This is exciting news to me, since this is a new, unknown draw in a ballot that is generally considered by a lot of players to be the most difficult of the 3-move tournament ballots. This morning I loaded the new opening book into Kingsrow and played along the PV. It dropped out of book at the 40th ply into a very interesting position where White had sacrificed a man to gain a first king with a positional advantage."
Ed showed the following line of play:
A---25-22 4-8 31-27 same.
B---This is the end of the computer's opening book moves. Note that Ed constructed a special opening book that examined the Double Cross in great depth and detail.
Ed comments further, "Every black move from 24-20 up to move 13 is forced."
Here is the position at the end of the KingsRow specialized Double Cross opening book.
Finding the rest of the solution is not an easy task, but you owe it to yourself to give it a try. The solution is not long, but it is very surprising, perhaps ranking among the most surprising things we've ever seen on the checkerboard. After you've done your analysis, click on Read More to see the truly stunning conclusion.[Read More]
The amusement park in Cedar Point, Ohio, isn't known as Checkersland, though it could well have been so named, instead of bearing the much less original title of Cedar Point Amusement Park. We're certain that it's a fine amusement park, though we doubt that it honors the history of Cedar Point as the home of a series of high-level championship checker tournaments some decades back. More's the pity; we could envision some sort of checker-themed roller coaster as a major attraction.
However, there really is a Checkersland; but it's a relatively new checker-playing computer program. In recent years we've come across very few new checker playing programs that were intended to be more than toys, but Checkersland certainly is a serious effort. It plays a very wide variety of checker games, running the gamut from American-British "straight" checkers, to Russian, to pool, and even Turkish and Sri Lankan and many more. The graphics are attractive and there are a number of useful features, such as reading PDN, position set-up, and the like. Best of all: Checkersland is coded in Java and so will run on Windows, Linux, Mac, and in fact on just about any computer that boasts a conformant Java implementation. The Checkersland web site can be found here.
We were more than anxious to try out this latest checker-playing effort, and pleased to see that it featured many levels of play, from "easy" right through "impossible." So we carried out our standard test, playing Checkersland at its highest level, "impossible," against Martin Fierz's Simple Checkers. We gave Simple Checkers five seconds per move to match the time that Checkersland seemed to take on our laboratory test system, Konanekane, a dual-core 2.5 GHz machine with 4 GB of memory.
To make a long story short, our testing showed that Checkersland is a work in progress, at least with respect to "straight" checkers; even at its "impossible" level, it was handily defeated by Simple Checkers.
You can see the full game in animated form, with brief comments, by clicking here. But first, we'd like to show you a position from the game. Checkersland has just made a weak move, and now Simple Checkers, playing Black, can play a winning line.
Our challenge to you is to find the win for Black. It's certainly not at the "impossible" level, but it does require a bit of thought and follow-through. After you've done the possible, or found it not possible, click on Read More to see one possible winning line, or else go back and view the animation to see the solution in the context of the entire game.
We hope that the Russian author of Checkersland continues to work on his product, as we believe it has a lot of unrealized potential. And, by the way, Cedar Point Amusement Park take note: we'd still like some day to ride on a checker-themed roller coaster.[Read More]
Checkers has been solved. This headline has appeared in many news articles in the past few days, as Dr. Jonathan Schaeffer announced on July 19, 2007, that his team at the University of Alberta has completed its quest to develop a rigorous proof that checkers is, with perfect play on both sides, a drawn game.
This proof is a monumental achievement in computer science and took years of work to develop. The proof is non-trivial in both complexity and scope; Dr. Schaeffer's team not only proved that checkers is a draw, they've shown how it is done. Given a very powerful computer, an enormous database, and some sophisticated computer code, you could draw all your games from now on.
What's our reaction to this? As computer-literate folks ourselves, we're most impressed with Dr. Schaeffer's work. As checker players and fans, we can only say, "Let's play checkers." In fact, this week your editor himself was quoted in the journal Nature, when asked about the impact of this new proof on the checker-playing public, as saying, "People will keep right on playing checkers."
And, of course, they will.
We note that, while checkers is now known definitively to be a draw, this coming week's U.S. National Tournament hasn't been canceled, and we in fact predict with great confidence that the tournament will produce some really fine games of checkers. Too, we have no plans to discontinue publishing The Checker Maven as we still think there are plenty of checker topics worthy of column space.
While the computer has had a profound influence on our game, mostly (but not entirely) for the good, checkers is in the end a game for people. Real, live people from all walks of life play checkers, and they won't stop now. From country picnic tables all the way to Las Vegas tournament halls, checkers will go on.
A draw? Yes, in theory, the game is a proven draw. In practice, it's still a great deal of fun and a tremendous intellectual challenge, and that's not going to change.
Editor's note: The photo at the top of this article is a file photo and is not of Dr. Schaeffer and his laboratory.
Here's an astronomy question for our checker fans:
Q. What's denser than the galactic core and has to do with checkers?
A. Ed Gilbert's 10-piece endgame database.
Completed in just under a year (admittedly somewhat faster than the Milky Way galaxy took to reach its present form), and containing the proverbial billions and billions of endgame positions, Ed Gilbert's 10-piece endgame database is dense, rich, and full of checker content. It's the very first product of its type to become available for home use by serious checker enthusiasts, and it is destined to change and enhance top-level checker play as leading players begin to use it for analysis and study.
The concepts involved seem complex on a nearly cosmological scale, so to bring things down to earth, The Checker Maven interviewed database creator Ed Gilbert by email. Here's what Ed has to say about the database and other topics of interest.
CM: Tell us a little bit about the 10 piece database.
Ed: It's a database which gives the exact win, draw, or loss value of every checkers position having from 2 to 10 pieces with up to 5 pieces on a side. Checker programs have used endgame databases for quite a while. What makes this one somewhat newsworthy is its size and the tremendous increase in playing strength that it gives to Kingsrow. Up until about 2001 the largest endgame database that was available to the public was for 6 pieces. At that time several commercial programs introduced 8 piece databases, and free programs like Cake and Kingsrow soon followed with 8 piece databases computed by Martin Fierz and Jonathan Schaeffer.
The number of checker positions that are possible increases rapidly with the number of pieces on the board, as does the size of these databases and the time it takes to compute them. A 6-piece database can be computed in a couple of hours, and easily fits on a 64 MB flash drive. An 8-piece database takes a couple of weeks and occupies about 4 GB. It is only within the last few years that anyone has been able to build a 10-piece database. Jonathan Schaeffer was able to complete the first 10-piece database in 2003 using the resources that were available to him as a department head at the University of Alberta. He estimates that it took him 15 computer years to build it. The Kingsrow 10-piece database is the first to be available to the general public.
CM: How long did it take you to develop the database? How did you go about it? What would you estimate you've invested in time and expense?
Ed: I started building it in August 2004, and finished 11-1/2 months later. Before starting, I spent several months working on the program that would build it and optimizing its performance to run as quickly as possible. I wanted to build the database in one year, and I did not want to buy 15 computers to do it! I also had to work out some technical difficulties in building some of the very largest subdivisions. Schaeffer had available to him a powerful 64-processor Silicon Graphics workstation with 32 GB of RAM. I had to work out schemes to build everything on my home PCs with only 2 GB. I ended up building and using 4 computers to do most of the database computations. By building them myself I was able to save some money, and also get exactly what I needed, which was computers with lots of RAM and the largest hard drives available, but without any other extras normally purchased with a new PC. I didn't need keyboards, mice, speakers, or displays, since these machines were only used to compute databases 24 hours a day. There is a more detailed account of my building the database at my web site here:
It describes some of the computer science involved, as well as my experiences maintaining the equipment and keeping the whole thing running for a year.
CM: How do you think the database will influence checker play and analysis?
Ed: I think it will help identify some errors in published play, and it will allow many difficult endings that have up to now been left as 'probable loss' or 'probable draw' to be conclusively resolved. Many tournament players like to review their games and see if they missed a win, if their opponent missed a win, or if they lost, identify the exact losing move. The 10pc db is a tremendous help with this type of analysis. There is also a small group of analysts that like to try to find draws in absurdly difficult opening lines. I admit to doing some of this myself, although strictly using the computer, since my own crossboard analytical skills are not very strong. I've been maintaining a specialized 10-piece opening book in which I've accumulated new play in ballots like the Black Hole, Twilight Zone, Gemini, Wilderness, Double Cross, Octopus, etc. This opening book is available on the web here:
CM: Although we know that overall you won't even come close to making a profit, nonetheless in absolute dollar terms the database is, at $160, a non-trivial investment. What do you think is the market?
Ed: I think the number of people that will purchase this is quite small, due to the declining population of serious checkers players. I will be surprised if the money made from selling 10 piece databases will cover the cost of even one of the computers that I used to build it. This is not a money making business! Selling the database allows me to get it to the people that really want it and can make good use of it. As far as the price goes, I think it's an incredible bargain. If you are serious about playing checkers, there is nothing else that comes close to giving you the information it provides. You cannot obtain one of these anywhere else in the world. If you wanted to build one yourself, assuming you had the skills to do it, you'd have to buy a bunch of computers and run them for a very long time. Then you'd have to figure out how to integrate it into a checkers program, which means you'd have to write your own checkers program...
CM: We understand that the database needs some hefty computing power for effective use. Can you tell us something about that?
Ed: A few years ago this was true, but by today's standards the requirements are no longer "hefty." Basically you need a sufficient amount of RAM to cache portions of the database during a search, and you need enough free hard drive space to store it. That means you need about 2 GB of RAM and about 250 GB of free disk space. There is a way you can use the database with less; you can read the details at the Kingsrow web site. If you're using an older machine that lacks these, it's likely you can add them quite inexpensively. A coworker at the office just bought 2 GB of RAM for $75, and 320 GB hard drives are selling for around that same amount.
CM: Whenever large leaps are taken in computer checkers, such as represented by your database, there is inevitable discussion of "solving" checkers once and for all. Indeed, a university team is working on that very thing. Do you think checkers will be "solved" and how do you think that might affect human play?
Ed: Apparently Schaeffer is getting close to proving the win/loss/draw value of all of the 3-move ballots. This will be a nice milestone in computer science and the culmination of perhaps 10 years of work by his team. I expect that they will simply confirm what people have already known about the ballots for a long time, and it will have very little if any effect on human play. However there always is a small possibility that they will find a draw in one of the ballots thought to lose. If this occurs then it will be big news for 3-move tournament players, and also for that other group I mentioned that likes to look for draws in absurdly difficult opening lines.
CM: You are a world-class checker programmer and database developer. Can you tell us a little about your background and interests? What do you do as a "day job" and what do you do for fun?
Ed: For about the last 27 years I've been working for what used to be called Hewlett-Packard, until they spun off their electronic instrument business as a separate company a few years ago, and now I work for that company, Agilent Technologies. I work in R&D as an engineer developing new products. I'm married, with 3 daughters, two of whom have finished college, while the youngest just turned 20 and has a couple of years left to go. I do a lot of bicycling on my road bike (on an Italian steel frame that I built into a complete bike about 15 years ago), and I've also been a runner and regular lap swimmer for over 35 years. When I was younger I used to race in running road races and triathlons, and now I keep up these activities simply for the enjoyment and the exercise.
CM: Can you tell our readers how they might decide if they want the database, and how they might obtain it if so?
Ed: If you're serious about the game of checkers, and you want to have the most powerful tool for analyzing positions that is available anywhere, then you want the 10-piece database. All the information that you need to obtain it, including PC requirements and installation instructions, are at the Kingsrow web site:
The Checker Maven thanks database developer Ed Gilbert for granting us this interview, and repeats Ed's statement that if you're a serious checker player, the 10-piece database is a true "must have" product. We're getting our own copy as soon as we possibly can.
Database Creator Ed Gilbert
Photo courtesy of Carol Gilbert
Ed Gilbert has set his KingsRow engine and 10 piece endgame database on the path of more deep discoveries as he continues to mine interesting problems. Here's a sample of some of the things Ed has unearthed:
See for yourself; you can download this special opening book, for use with KingsRow and CheckerBoard, from Ed's web site here.
Thanks to both Ed Gilbert and Brian Hinkle for providing us with input for this story.