Troubleshooting
1. Maven-related issues
Iceberg JAR file not found. Building with Maven...
usage: mvn [-h] [_ ...]
mvn: error: unrecognized arguments: -Dmaven.test.skip=true
Solution
Install the correct version of Maven. Run which mvn or mvn --version to verify the Maven installation.
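If mvn on the PATH is not actually Apache Maven (for example, another tool shadowing the command, which would explain the argparse-style usage output above), the checks below help confirm which executable is being resolved. A minimal sketch; paths differ per system.

```bash
# Show which executable the shell resolves for mvn
which mvn

# Apache Maven prints a banner such as "Apache Maven 3.9.x ..."
mvn --version

# List every mvn found on the PATH to spot a shadowing executable
type -a mvn
```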
2. AWS region issues
Region must be specified either via environment variable (AWS_REGION) or system property (aws.region)., software.amazon.awssdk.regions.providers.AwsProfileRegionProvider@7a7eb56e: No region provided in profile: default, software.amazon.awssdk.regions.providers.InstanceProfileRegionProvider@241d5e60: Unable to contact EC2 metadata service.
Solution
Set the AWS region explicitly if the error occurs. For example:
export AWS_REGION=us-east-1
Replace us-east-1 with the appropriate region.
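If you prefer not to rely on an environment variable, the region can also be supplied through the AWS CLI profile or as a JVM system property, both of which are named in the error message. A sketch, with us-east-1 as a placeholder and the jar path as an assumption:

```bash
# Persist a default region for the default AWS profile (~/.aws/config)
aws configure set region us-east-1

# Or pass the region as a JVM system property when launching the Java
# server directly (jar path is a placeholder)
java -Daws.region=us-east-1 -jar path/to/server.jar
```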
3. Linkage error
2025-04-02T13:51:25Z ERROR [Java-Iceberg:50051] Error: LinkageError occurred while loading main class io.debezium.server.iceberg.OlakeRpcServer
2025-04-02T13:51:25Z ERROR [Java-Iceberg:50051] java.lang.UnsupportedClassVersionError: io/debezium/server/iceberg/OlakeRpcServer has been compiled by a more recent version of the Java Runtime (class file version 61.0), this version of the Java Runtime only recognizes class file versions up to 55.0
Solution
Ensure that the Java version is 17 or higher (class file version 61.0 in the error corresponds to Java 17, while 55.0 corresponds to Java 11). Check the installed version with:
java -version
If the version is older, update Java to a supported version. Reload the shell configuration (e.g., source ~/.zshrc) after updating.
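One way to check and switch the active JDK; the update-alternatives step applies to Debian/Ubuntu-style systems and the SDKMAN version identifier is only an example, so adapt to your environment:

```bash
# Confirm the active runtime; it must report 17 or higher
java -version

# Debian/Ubuntu: list installed JDKs and pick a 17+ build as default
sudo update-alternatives --config java

# Alternatively, with SDKMAN (version identifier is an example)
sdk install java 17.0.9-tem
sdk use java 17.0.9-tem

# Reload the shell configuration so the change takes effect
source ~/.zshrc
```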
4. gRPC connection failure
Cause: The gRPC server is not running or the port configuration is incorrect.
Solution
- Verify that the Java gRPC server is running.
- Confirm that the port specified in destination.json matches the server port.
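To check that the server is actually listening, the port 50051 shown in the log above can be probed directly; replace it with whatever port destination.json configures, and note the Compose service name below is a placeholder:

```bash
# Is anything listening on the gRPC port?
lsof -i :50051

# Or probe connectivity directly; nc reports success if the port accepts connections
nc -zv localhost 50051

# If the server runs under Docker Compose, inspect its logs
docker compose logs --tail=50 <grpc-server-service>
```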
5. Data not appearing in Spark
Cause: Misconfiguration or delayed data propagation.
Solution
- Double-check the destination.json configuration.
- Connect to the spark-iceberg container and run a Spark SQL query (see the example below).
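A sketch of such a query session, assuming the spark-iceberg image ships the spark-sql CLI; the namespace and table names are placeholders that depend on how destination.json is configured:

```bash
# Open a Spark SQL shell inside the spark-iceberg container
docker compose exec spark-iceberg spark-sql

# Inside the shell, list namespaces/tables and sample some rows
# (identifiers are placeholders):
#   SHOW NAMESPACES;
#   SHOW TABLES IN <namespace>;
#   SELECT * FROM <namespace>.<table> LIMIT 10;
```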
6. Docker Compose issues
Cause: Docker or Docker Compose is not properly installed or services failed to start.
Solution
- Verify Docker and Docker Compose are installed and running.
- Use docker compose ps to check service statuses.
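A few checks that cover the common failure modes; the service name in the log command is a placeholder:

```bash
# Confirm the Docker daemon is running
docker info

# List compose services and their current state
docker compose ps

# Inspect the logs of a service that failed to start
docker compose logs --tail=100 <service-name>

# Recreate the stack from a clean state if services are stuck
docker compose down && docker compose up -d
```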
For catalog-specific issues, refer to the corresponding documentation.
Local testing
Follow the Docker Compose setup to run the Glue, Hive, JDBC, or REST catalog locally.
- Clone the repo and navigate to the local test directory, then run docker compose up.
- Create source.json and destination.json.
- Run Discover → Sync → Verify via Spark SQL (an end-to-end sketch follows below).
Key configs are available in tabs on the local setup page. See also: REST catalog permissions and Spark SQL examples.
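An end-to-end sketch of the steps above; the repository URL, directory names, and the exact Discover/Sync invocations are placeholders, so substitute the commands documented for your connector:

```bash
# 1. Clone the repo and move into the local test directory (paths are placeholders)
git clone <repository-url>
cd <repository>/<local-test-directory>

# 2. Start the local catalog stack (Glue/Hive/JDBC/REST, depending on the compose file)
docker compose up -d

# 3. Create source.json and destination.json in this directory,
#    using the key configs from the local setup page

# 4. Run Discover, then Sync, with your connector's CLI
#    (placeholder commands; consult the connector docs for exact invocations)
<discover-command>
<sync-command>

# 5. Verify the synced data via Spark SQL inside the spark-iceberg container
docker compose exec spark-iceberg spark-sql
```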