I am trying to implement an artificial intelligence player for Othello using the Minimax algorithm. The computer plays decently, but it's not great. Did I implement it correctly in the following code?
Coordinate bestCoordinate = null;

public int minimax(MyButton[][] gameBoard, int depth, boolean maximizingPlayer) {
    if (depth == 0) {
        return evaluateBoard(gameBoard);
    }
    if (maximizingPlayer) {
        int bestValue = Integer.MIN_VALUE;
        LinkedList<Coordinate> moves = generateMoves(gameBoard);
        for (Coordinate move : moves) {
            MyButton[][] newBoard = cloneBoard(gameBoard);
            processMove(newBoard, newBoard[move.getxCoordinate()][move.getyCoordinate()]);
            int v = minimax(newBoard, depth - 1, !maximizingPlayer);
            if (v > bestValue) {
                bestValue = v;
                bestCoordinate = move;
            }
        }
        return bestValue;
    } else {
        int bestValue = Integer.MAX_VALUE;
        LinkedList<Coordinate> moves = generateMoves(gameBoard);
        for (Coordinate move : moves) {
            MyButton[][] newBoard = cloneBoard(gameBoard);
            processMove(newBoard, newBoard[move.getxCoordinate()][move.getyCoordinate()]);
            int v = minimax(newBoard, depth - 1, !maximizingPlayer);
            if (v < bestValue) {
                bestValue = v;
                bestCoordinate = move;
            }
        }
        return bestValue;
    }
}
Also, here is my evaluation function:
public int evaluateBoard(MyButton[][] gameBoard) {
    int blackPieces = 0;
    int whitePieces = 0;
    for (MyButton[] array : gameBoard) {
        for (MyButton button : array) {
            if (button.getBackground().equals(Color.BLACK)) {
                blackPieces++;
            } else if (button.getBackground().equals(Color.WHITE)) {
                whitePieces++;
            }
        }
    }
    int cornerBonus = 10;
    if (gameBoard[0][0].getBackground().equals(Color.BLACK)) {
        blackPieces += cornerBonus;
    }
    if (gameBoard[0][getBoardWidth() - 1].getBackground().equals(Color.BLACK)) {
        blackPieces += cornerBonus;
    }
    if (gameBoard[getBoardHeight() - 1][0].getBackground().equals(Color.BLACK)) {
        blackPieces += cornerBonus;
    }
    if (gameBoard[getBoardHeight() - 1][getBoardWidth() - 1].getBackground().equals(Color.BLACK)) {
        blackPieces += cornerBonus;
    }
    if (gameBoard[0][0].getBackground().equals(Color.WHITE)) {
        whitePieces += cornerBonus;
    }
    if (gameBoard[0][getBoardWidth() - 1].getBackground().equals(Color.WHITE)) {
        whitePieces += cornerBonus;
    }
    if (gameBoard[getBoardHeight() - 1][0].getBackground().equals(Color.WHITE)) {
        whitePieces += cornerBonus;
    }
    if (gameBoard[getBoardHeight() - 1][getBoardWidth() - 1].getBackground().equals(Color.WHITE)) {
        whitePieces += cornerBonus;
    }
    return whitePieces - blackPieces;
}
(The computer always plays white, and the human is black).
I’m mainly unsure because the computer doesn’t seem to protect corners, despite the bonus points that they give. Is there anything wrong with my code/logic?
2 Answers
I did not test your code myself, but that is the minimax algorithm, and it appears to be written correctly (assuming your helper functions are implemented correctly). I have some points that might give you insight as to why your agent is not acting optimally:

You are updating your best move at every depth of the search, not just at the root. Make a constant called SEARCH_DEPTH outside of the function, use it every time you call the function, and guard the assignment with an if check (see the sketch after this answer).

Also, assuming the computer is the maximizing player, you only want to set the move inside the if (maximizingPlayer) block.
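A minimal sketch of what that could look like, reusing your own field bestCoordinate and your helpers (generateMoves, cloneBoard, processMove, evaluateBoard); SEARCH_DEPTH is an illustrative constant and must equal the depth you pass in at the root call:

private static final int SEARCH_DEPTH = 4; // assumed to match the depth used at the root call

public int minimax(MyButton[][] gameBoard, int depth, boolean maximizingPlayer) {
    if (depth == 0) {
        return evaluateBoard(gameBoard);
    }
    if (maximizingPlayer) {
        int bestValue = Integer.MIN_VALUE;
        for (Coordinate move : generateMoves(gameBoard)) {
            MyButton[][] newBoard = cloneBoard(gameBoard);
            processMove(newBoard, newBoard[move.getxCoordinate()][move.getyCoordinate()]);
            int v = minimax(newBoard, depth - 1, false);
            if (v > bestValue) {
                bestValue = v;
                // Record the move only at the root of the search, and only for the maximizing side.
                if (depth == SEARCH_DEPTH) {
                    bestCoordinate = move;
                }
            }
        }
        return bestValue;
    } else {
        int bestValue = Integer.MAX_VALUE;
        for (Coordinate move : generateMoves(gameBoard)) {
            MyButton[][] newBoard = cloneBoard(gameBoard);
            processMove(newBoard, newBoard[move.getxCoordinate()][move.getyCoordinate()]);
            // The minimizing branch simulates the opponent, so bestCoordinate is never touched here.
            bestValue = Math.min(bestValue, minimax(newBoard, depth - 1, true));
        }
        return bestValue;
    }
}

With that guard, an inner recursive call can no longer overwrite the move chosen at the top level, and the minimizing branch can no longer leak an opponent's reply into bestCoordinate.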