We've written before about February being the shortest month of the year, when mortgage, rent, and other monthly payments remain unchanged despite covering fewer days. Well, this February is slightly different, as 2020 is a leap year and February gets an extra day. We suppose that helps at least a little bit.
When you have the solution, let your mouse take the "almost shortest" path to Read More to verify your answer.
We say it every year: Thanksgiving is our favorite holiday. It's typically American (and, of course, Canadian), it's relevant to every belief and creed, and it unites us all in giving thanks for the many blessings we have. No matter our station in life, we can all find something to be thankful for.
We like to turn to the great American checkerist and problemist Tom Wiswell for holidays such as these, and what better than a problem, arising from an actual game, that Mr. Wiswell called Dixie.
Mr. Wiswell noted, "We met with the well-known O. J. Tanner, who got us into the following predicament."
White may be a piece up but his options are severely limited by the two Black kings. Can you find your way out of this one? Make yourself a hot turkey sandwich (with mashed potatoes and gravy, of course) as you contemplate this one, and then--- when you've finished the sandwich and come up with a solution--- click on Read More to check your answer.
Computers have progressed a very, very long way since their earliest days. It may not be quite as well known, though, that computers have been programmed to play games since almost the very beginning. But we doubt that the hardy coding pioneers of the time would have dreamed just how far the state of the art has come since then.
Certainly well known today are Google's phenomenal Alpha game playing programs, which employ self-teaching or "machine learning" methods. After years and years of computer Go programs barely reaching respectable playing levels, AlphaGo appeared on the scene and defeated one of the world's highest ranked Go players, something no one ever expected. And AlphaZero very quickly became able to defeat even the strongest chess playing programs around. Machine learning is here to stay, and the results are phenomenal.
Of course, it's not new. The idea was proposed by an IBM scientist over 60 years ago. But implementing it successfully was the issue, for to succeed, the computer programs would have to "train" on millions and millions of different game positions. That wasn't a realistic possibility until relatively recent years.
The method worked well at first for non-deterministic games--- games with an element of luck. GNU Backgammon played at the master level, as did others. But applications to checkers largely failed. Blondie24 was a lot of fun but never a serious competitor, and NeuroDraughts wasn't fully developed.
All that has changed, though, with renowned checker engine programmer Ed Gilbert's latest developments for his world class Kingsrow computer engine. Ed was kind enough to send us the details. The following was written by Ed Gilbert with input from Rein Halbersma.
Until recently, Kingsrow used a manually built and tuned evaluation function. This function computes a numeric score for a game position based on a number of material and positional features. It looks at the number of men and kings of each color, and position attributes including back rank formation, center control, tempo, left-right balance, runaways (men that have an open path to crowning), locks, bridges, tailhooks, king mobility, dog-holes, and several others. Creating this function requires some knowledge of checkers strategy, and is very time consuming.
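To make the idea concrete, here is a minimal sketch of a hand-built linear evaluation of the kind Ed describes. This is our own illustration, not Kingsrow's actual code; the class, weights, and feature mechanism are all invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Position:
    """A bare-bones position summary; a real engine tracks the full board
    in order to compute positional features like back rank and tempo."""
    white_men: int
    black_men: int
    white_kings: int
    black_kings: int

def material(pos, man=100, king=130):
    """Weighted material difference, positive when White is ahead.
    The weights 100 and 130 are illustrative, not Kingsrow's values."""
    return (man * (pos.white_men - pos.black_men)
            + king * (pos.white_kings - pos.black_kings))

def evaluate(pos, positional_terms=()):
    """Material plus a sum of hand-weighted positional feature terms.
    Each term is a (weight, feature_fn) pair, where feature_fn maps a
    Position to a signed count (back-rank strength, center control,
    tempo, runaways, and so on in a real hand-tuned eval)."""
    return material(pos) + sum(w * f(pos) for w, f in positional_terms)
```

The time-consuming part Ed mentions is not this summation but choosing the features and tuning each weight by hand against known checkers theory.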
The latest Kingsrow has done away with these manually constructed and tuned evaluation features. Instead it is built using machine learning (ML) techniques which require no game specific knowledge other than the basic rules of the game. It has learned to play at a level significantly stronger than previous versions entirely through self-play games.
In a test match of 16,000 blitz games (11-man ballot, 0.3 seconds per move), it scored a +72 ELO advantage over the best manually built and tuned eval version. There were more than 5 times as many wins for the ML Kingsrow as losses.
The ML eval uses a set of overlapping rectangular board regions. These regions are either 8 or 12 squares, depending on whether kings are present. For every configuration of pieces on these squares, a score is assigned by the machine learning process. A position evaluation is then simply the sum of the scores of each region, plus something for any material differences in men and kings. In the 8-square regions, each square can either be empty or occupied by one of the four piece types, so there are a total of 5^8 = 390,625 configurations. In the 12-square regions there are no kings, so there are 3^12 = 531,441 configurations.
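As a rough sketch of how such a scheme can be computed (our illustration, not Kingsrow's actual code), each region's occupancy can be packed into a base-5 index (base-3 for the kingless 12-square regions) that looks up its learned score, and the evaluation is the sum over regions plus a material term:

```python
# Illustrative sketch, not Kingsrow's actual code. Square states when
# kings are possible: empty, white man, white king, black man, black king.
EMPTY, WM, WK, BM, BK = range(5)

def region_index(squares, base=5):
    """Pack a region's square contents into a single base-`base` index.
    An 8-square region with base 5 yields 5**8 = 390,625 possible indices;
    a kingless 12-square region with base 3 yields 3**12 = 531,441."""
    idx = 0
    for s in squares:
        idx = idx * base + s
    return idx

def evaluate_position(regions, score_tables, material_score):
    """Evaluation = material term + sum of each region's learned score,
    where score_tables[i] holds one learned value per configuration."""
    return material_score + sum(
        table[region_index(squares)]
        for squares, table in zip(regions, score_tables)
    )
```

The lookup makes evaluation very fast at search time: all the checkers knowledge lives in the precomputed score tables.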
To compute values for each configuration, a large number of training positions are needed. I created a database of approximately one million games through rapid self-play. Each game took about 5 seconds. The positions are extracted from the games, and each position is assigned the win, draw, or loss value of the game result. Initially, the scores for the rectangular board regions are set to random values. Through a process called logistic regression, the values are adjusted to minimize the mean squared error when comparing the eval output of each training position to the win, draw, or loss value that was assigned from the game results.
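The training loop Ed describes can be sketched roughly as follows. This is a toy illustration of logistic regression on win/draw/loss targets, not his implementation; the feature encoding, epoch count, and learning rate are invented. Each position activates one configuration index per region; the summed score is squashed to a predicted win probability, and the active scores are nudged to shrink the squared error against the game result:

```python
import math
import random

def sigmoid(x):
    """Map a raw evaluation score to a win probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def train(positions, results, n_weights, epochs=200, lr=0.1):
    """positions: per-position lists of active weight indices (one per region);
    results: game outcome for the scored side, 1 = win, 0.5 = draw, 0 = loss."""
    # Start from small random values, as in the process described above.
    w = [random.uniform(-0.01, 0.01) for _ in range(n_weights)]
    for _ in range(epochs):
        for feats, r in zip(positions, results):
            p = sigmoid(sum(w[i] for i in feats))  # predicted win probability
            grad = (p - r) * p * (1.0 - p)         # squared-error gradient
                                                   # (constant folded into lr)
            for i in feats:
                w[i] -= lr * grad
    return w
```

At Kingsrow's scale the same idea runs over millions of positions and nearly a million weights per region table, but the update rule is conceptually this simple.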
Similar machine learning techniques have been used in other board game programs. In 1997, Michael Buro described a similar process that he used to build the evaluation function for his Othello program named Logistello. In 2015, Fabien Letouzey created a strong 10x10 international draughts program named Scan using an ML eval, and around this time Michel Grimminck was using an ML eval in his program Dragon. Since then other 10x10 programs have switched to ML evals, including Kingsrow, and Maximus by Jan-Jaap van Horssen. I think that the English and Italian variants of Kingsrow are the first 8x8 programs to use an ML eval.
Ed's new super-strong version of KingsRow is available for free download from his website. Combine that with his 10-piece endgame database, and you'll have by far the strongest checker engine in the world, a fearsome competitor and an incredible training partner.
Let's look at a few difficult positions, some of which were analyzed by human players for years and even by reasonably strong computer engines for hours. KingsRow ML solved each and every one of them virtually instantly.
First, the so-called "100 years problem" (as in Boland's Masterpiece, p. 125 diagram 1).
Next, the Phantom Fox Den, from Basic Checkers 2010, p. 260.
And finally, a position suggested by Richard Pask, from Complete Checkers p. 273 halfway down, where Mr. Pask notes: "12-16?! has shock value ..."
Surely we don't expect you to solve each of these (unless you wish to), but do look them over and at least form an opinion. Then click on Read More and be amazed.
Many golf enthusiasts just can't get enough, and while we don't know how they do it, some of them golf even during the winter when snow covers the ground. They're out taking their "winter strokes" and loving every minute of it.
We have a different "winter stroke" to offer you. We don't know if snow is as yet on the ground where you live--- it might well be in some locations--- but we're pretty sure you've kept the snow off your checkerboard. Here's a stroke problem that will stretch your powers of visualization.
Solving these problems is par for the course; when you're done, click on Read More to score your solution.
A few weeks back, we presented Ed Gilbert's review of checker apps for the iPhone. We promised a similar review of Android apps, and we're bringing you that review today.
The iPhone scene was frankly pretty bleak. Ed didn't find a single truly serious checker app for the popular Apple smartphone. We'll cut to the chase: for Android phones, the situation is somewhat better. There are two reasonable candidates for the "serious app" title and a whole host of "toy" apps.
As was the case with the iPhone, there is just too much detail for a single weekly column, so instead you'll find the full details on this separate web page.
The two apps that we think merit consideration are Checkers Tutor, by world class checker programmer Martin Fierz (author of CheckerBoard and the Cake computer engine), and Checkers for Android by programmer Aart Bik, who is best known for Chess for Android. Aart's checker app is free and Martin's app sells for just one dollar.
Interesting and detailed email interviews with both Martin and Aart can be found on the web page linked above.
So, which app should you install? Checkers Tutor plays a reasonable game, certainly well above the casual skill level. The same can be said of Checkers for Android. It's also safe to say that neither of these will provide a serious challenge to a master-level player.
Our usual benchmark is Martin Fierz's Simple Checkers engine. Both Checkers Tutor and Checkers for Android play somewhat better than Simple, at least based on the limited trial matchups conducted in our offices. But you won't get play at the level of Cake or KingsRow, either.
There's some irony here. Both Martin and Aart received a lot of user feedback saying that the engines are too strong and that the user can never win a game! This perhaps says more about the casual player than it does about the strength of the computer engines.
How do Checkers Tutor (CT) and Checkers for Android (CFA) compare with each other?
CT has hands-down the better graphics and display, and some additional play features, such as move take-back and replay, neither of which are present in CFA. Both programs lack a move list or a means of exporting moves. CT has optional square numbers; CFA allows compulsory jumping to be turned off, which you may consider a feature or an anti-feature, but Aart says users demanded it. CT allows for selection of a random 3-move ballot. CFA has a tiny endgame database which most notably allows it to convert a two-kings-versus-one-king win, something a surprising number of programs can't do.
Neither app allows for position set-up, but neither engine is really strong enough to do meaningful analysis.
Which engine plays better checkers? That's a good question. We think CT has a definite edge, albeit not a large one. In the two head-to-head matches that we conducted, CT handily won one of them. CFA got a winning position in the other game but then couldn't figure out the ending and the result was a draw. Such limited testing is hardly decisive, of course.
Our bottom line is that there's little reason not to install both apps and make your own comparison. It will only cost you a dollar, after all, and you'll have a lot of fun ahead of you.
Several of the test games are on the companion page mentioned earlier. What we'll show you here is an excerpt from one of the games between CT and CFA. We'll stop at one of the game's critical points and let you take over.
The game was played on April 18, 2012 at the Hawai`i State Library. Checkers Tutor had Black and played at 15 seconds per move. Checkers for Android had White and played at 10 seconds per move (it wasn't possible to set the same timing for both).
An inferior move; 22-17 or 22-28 are better.
4-8 holds the edge.
A checker subtlety: 22-17 equalizes, while this does not.
4-8 would have maintained the lead.
This move loses; 30-26 is correct. CT now has a win on the board but won't find it. Will you?
Who's better, CT, CFA, or you? Match wits with the computer, then click on Read More to see the solution.
The telephone above is most definitely not an iPhone, but it does seem to pretty well represent the state of the art when it comes to playing checkers on an iPhone.
This article is the first of two on smartphone checker apps. Ed Gilbert, author of the world-class KingsRow checker engine and companion 10-piece endgame database, has graciously given of his time in order to evaluate a group of checker apps for the iPhone. A future article will look into checker apps for Android-based phones.
Ed's article is extensive and includes large graphics, so it merits its own web page. You can find it here, but we'll give you Ed's bottom line right away: there's not much out there that has merit for the serious player. That's indeed regrettable, because from what Ed shows us, we can't help but conclude that the iPhone app authors could have done much better without a lot of additional effort. Unfortunately, most checker program authors think they are producing toys rather than serious game-playing programs, and that's just what they end up doing.
Let's look at a sample game that Ed ran between the iPhone checker apps "Teeny Checkers" and "Fantastic Checkers".
White now has a simple move that very likely leads to a win. Can you spot it?
We'll give you the answer and find out how the game progressed when you dial your mouse to Read More.
Some of the problems and study material presented in our weekly columns are, to say the least, somewhat challenging for the average player. But we've always tried to make a point of having something for everyone, so, once again, we're "taking a break" from the really hard stuff and presenting a problem that is interesting, practical, and not so difficult.
Black is a man up, but not for long. Still, there is a very nice win on the board. Can you see it? We'd rate this problem as no higher than "intermediate" in difficulty, and we're sure the more experienced players will call it "easy." Whatever it may be, can you solve it? The truly easy thing about it: clicking on Read More will lead you right to the solution.
A few years ago, Ed Gilbert, the author of the KingsRow computer checkers engine, and the creator of the companion 10-piece endgame database, sent some new play to one of checker's sharpest-eyed analysts, Brian Hinkle. Ed told Brian the following:
"The 5-9 24-20 Double Cross is indeed a draw. This is exciting news to me, since this is a new, unknown draw in a ballot that is generally considered by a lot of players to be the most difficult of the 3-move tournament ballots. This morning I loaded the new opening book into Kingsrow and played along the PV. It dropped out of book at the 40th ply into a very interesting position where White had sacrificed a man to gain a first king with a positional advantage."
Ed showed the following line of play:
A---25-22 4-8 31-27 same.
B---This is the end of the computer's opening book moves. Note that Ed constructed a special opening book that examined the Double Cross in great depth and detail.
Ed comments further, "Every black move from 24-20 up to move 13 is forced."
Here is the position at the end of the KingsRow specialized Double Cross opening book.
Finding the rest of the solution is not an easy task, but you owe it to yourself to give it a try. The solution is not long, but it is very surprising, perhaps ranking among the most surprising things we've ever seen on the checkerboard. After you've done your analysis, click on Read More to see the truly stunning conclusion.
The amusement park in Cedar Point, Ohio, isn't known as Checkersland, though it could well have been so named, instead of bearing the much less original title of Cedar Point Amusement Park. We're certain that it's a fine amusement park, though we doubt that it honors the history of Cedar Point as the home of a series of high-level championship checker tournaments some decades back. More's the pity; we could envision some sort of checker-themed roller coaster as a major attraction.
However, there really is a Checkersland; it's a relatively new checker-playing computer program. In recent years we've come across very few new checker playing programs that were intended to be more than toys, but Checkersland certainly is a serious effort. It plays a very wide variety of checker games, running the gamut from American-British "straight" checkers, to Russian, to pool, and even Turkish and Sri Lankan and many more. The graphics are attractive and there are a number of useful features, such as reading PDN, position set-up, and the like. Best of all: Checkersland is coded in Java and so will run on Windows, Linux, Mac, and in fact on just about any computer that boasts a conformant Java implementation. The Checkersland web site can be found here.
We were more than anxious to try out this latest checker-playing effort, and pleased to see that it featured many levels of play, from "easy" right through "impossible." So we carried out our standard test, playing Checkersland at its highest level, "impossible," against Martin Fierz's Simple Checkers. We gave Simple Checkers five seconds per move to match up with the time that Checkersland seemed to take on our laboratory test system, Konanekane, a dual-core 2.5 GHz machine with 4GB of memory.
To make a long story short, our testing showed that Checkersland is a work in progress, at least with respect to "straight" checkers; even at its "impossible" level, it was handily defeated by Simple Checkers.
You can see the full game in animated form, with brief comments, by clicking here. But first, we'd like to show you a position from the game. Checkersland has just made a weak move, and now Simple Checkers, playing Black, can play a winning line.
Our challenge to you is to find the win for Black. It's certainly not at the "impossible" level, but it does require a bit of thought and follow-through. After you've done the possible, or found it not possible, click on Read More to see one possible winning line, or else go back and view the animation to see the solution in the context of the entire game.
We hope that the Russian author of Checkersland continues to work on his product, as we believe it has a lot of unrealized potential. And, by the way, Cedar Point Amusement Park take note: we'd still like some day to ride on a checker-themed roller coaster.
"Checkers has been solved." This headline has appeared in many news articles in the past few days, as Dr. Jonathan Schaeffer announced on July 19, 2007, that his team at the University of Alberta has completed their quest to develop a rigorous proof that checkers is, with perfect play on both sides, a drawn game.
This proof is a monumental achievement in computer science and took years of work to develop. The proof is non-trivial, both in complexity and quality; Dr. Schaeffer's team not only proved that checkers is a draw, they've shown how to do it. Given that you have a very powerful computer, an enormous database, and some sophisticated computer code, you can draw all your games from now on.
What's our reaction to this? As computer literates ourselves, we're most impressed with Dr. Schaeffer's work. As checker players and fans, we can only say, "Let's play checkers." In fact, this week your editor himself was quoted in the journal Nature, when asked about the impact of this new proof on the checker playing public, as saying, "People will keep right on playing checkers."
And, of course, they will.
We note that, while checkers is now known definitively to be a draw, this coming week's U.S. National Tournament hasn't been canceled, and we in fact predict with great confidence that the tournament will produce some really fine games of checkers. Too, we have no plans to discontinue publishing The Checker Maven as we still think there are plenty of checker topics worthy of column space.
While the computer has had a profound influence on our game, mostly (but not entirely) for the good, checkers is in the end a game for people. Real, live people from all walks of life play checkers, and they won't stop now. From country picnic tables all the way to Las Vegas tournament halls, checkers will go on.
A draw? Yes, in theory, the game is a proven draw. In practice, it's still a great deal of fun and a tremendous intellectual challenge, and that's not going to change.
Editor's note: The photo at the top of this article is a file photo and is not of Dr. Schaeffer and his laboratory.