Scribe
A logging framework that claims to be the fastest on the JVM. Built from scratch in Scala with programmatic configuration. Rather than wrapping traditional Java logging frameworks, it achieves compile-time optimization through macros and supports real-time configuration changes.
Library
Scribe
Overview
Scribe is a logging framework that claims to be the fastest on the JVM. Built from scratch in Scala with programmatic configuration, it is not a wrapper around traditional Java logging frameworks; instead, it achieves compile-time optimization through macros and supports real-time configuration changes. Adoption has been expanding in 2025 among Scala applications where performance is paramount, and it is increasingly chosen for high-load systems because its pure Scala implementation and macro-based optimization outperform traditional Java-based solutions.
Details
As of 2025, Scribe has established a solid position as the "fastest JVM logging framework." Rather than depending on traditional Java logging frameworks, it is designed from scratch in Scala and takes a fundamentally different approach to logging: fast, effective logging with no configuration files and no additional dependencies. Scala macros perform as much optimization as possible at compile time, minimizing the impact of logging on the performance of production applications.
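As a minimal sketch of this zero-configuration approach (the object name is illustrative), a single call logs with timestamp, level, and call-site position, with no logger declaration and no configuration file:
// Minimal sketch: no logger declaration, no configuration file
object QuickStart extends App {
  // Call-site information (class, method, line) is captured at compile time via macros
  scribe.info("Hello from Scribe")
}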
Key Features
- Fastest JVM Performance: Industry-leading speed through macro-based compile-time optimization
- Pure Scala Implementation: Optimization through complete Scala design without Java dependencies
- Programmatic Configuration: Flexible in-code configuration without configuration files
- Real-time Reconfiguration: Support for dynamic configuration changes at runtime
- Zero Dependencies: Lightweight implementation without additional dependencies
- Cross-platform: Support for JVM, Scala.js, and Scala Native
Pros and Cons
Pros
- Achieves the highest level of logging performance in JVM environments
- Complete integration with Scala ecosystem through pure Scala implementation
- Provides programmatic flexibility without configuration files
- Enables dynamic adjustment during operation through real-time configuration changes
- Easy to adopt as a lightweight, dependency-free standalone library
- Minimizes performance impact in production environments through macro optimization
Cons
- Not a drop-in replacement for traditional Java logging frameworks
- Programmatic configuration has a learning curve for Scala beginners
- Integration with the existing Java logging ecosystem (Logback, Log4j, etc.) requires bridge modules such as scribe-slf4j (see the SLF4J bridge sketch in the Usage Examples below)
- Limited track record in enterprise environments
- The macro-based approach can make behavior harder to follow when debugging
- Fewer error handling and fallback features than other frameworks
Usage Examples
Installation and Basic Setup
// build.sbt
libraryDependencies += "com.outr" %% "scribe" % "3.16.1"
// Cross-platform projects (JVM, JS, Native)
libraryDependencies += "com.outr" %%% "scribe" % "3.16.1"
// When SLF4J interoperability is needed
libraryDependencies += "com.outr" %% "scribe-slf4j" % "3.16.1"
// When JSON output support is needed
libraryDependencies += "com.outr" %% "scribe-json" % "3.16.1"
// When file output is needed (the FileWriter examples below use the scribe-file module)
libraryDependencies += "com.outr" %% "scribe-file" % "3.16.1"
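SLF4J Bridge for Java Interoperability
With the scribe-slf4j module on the classpath, log calls made through the SLF4J API (for example, from existing Java libraries) are handled by Scribe. A minimal sketch, assuming the dependency above; the logger name is illustrative:
import org.slf4j.LoggerFactory
object Slf4jBridgeExample extends App {
  // This SLF4J call is routed to Scribe via the scribe-slf4j binding
  val legacyLogger = LoggerFactory.getLogger("legacy.java.Component")
  legacyLogger.info("Handled by Scribe")
}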
Zero Import, Zero Mixin Logging
// Direct usage without imports or trait mixins
class MyApplication {
def start(): Unit = {
scribe.info("Application starting")
doSomethingImportant()
scribe.info("Application processing completed")
}
def doSomethingImportant(): Unit = {
scribe.debug("Executing important process")
try {
// Business logic
processData()
scribe.info("Data processing completed successfully")
} catch {
case ex: Exception =>
scribe.error("Error occurred during data processing", ex)
throw ex
}
}
private def processData(): Unit = {
scribe.trace("Detailed trace information")
Thread.sleep(100) // Processing simulation
}
}
// Usage example - works immediately without configuration
object ZeroConfigExample extends App {
val app = new MyApplication()
app.start()
}
// Output example (the DEBUG line appears only if the minimum level is lowered from the default of Info):
// 2025.01.02 19:05:47.342 [main] INFO MyApplication:5 - Application starting
// 2025.01.02 19:05:47.342 [main] DEBUG MyApplication.doSomethingImportant:12 - Executing important process
// 2025.01.02 19:05:47.450 [main] INFO MyApplication:17 - Data processing completed successfully
// 2025.01.02 19:05:47.451 [main] INFO MyApplication:7 - Application processing completed
Traditional Logger Approach
import scribe.Logger
class TraditionalLoggingService {
// Explicit logger creation
val logger: Logger = Logger("TraditionalLoggingService")
def performOperation(operationId: String): Unit = {
logger.info(s"Operation started: $operationId")
val startTime = System.currentTimeMillis()
try {
executeBusinessLogic(operationId)
val duration = System.currentTimeMillis() - startTime
logger.info(s"Operation completed: $operationId (duration: ${duration}ms)")
} catch {
case ex: Exception =>
logger.error(s"Operation failed: $operationId", ex)
throw ex
}
}
private def executeBusinessLogic(operationId: String): Unit = {
logger.debug(s"Executing business logic: $operationId")
// Complex processing simulation
Thread.sleep(scala.util.Random.nextInt(200) + 50)
if (scala.util.Random.nextDouble() < 0.1) {
throw new RuntimeException(s"Processing error: $operationId")
}
logger.debug(s"Business logic completed: $operationId")
}
}
// Usage example
object TraditionalExample extends App {
val service = new TraditionalLoggingService()
for (i <- 1 to 5) {
try {
service.performOperation(s"op_$i")
} catch {
case _: Exception =>
// Error already logged
}
}
}
Programmatic Configuration and Customization
import scribe._
import scribe.format._
import scribe.file._ // FileWriter and the path builder DSL (provided by the scribe-file module)
object ConfigurableLoggingExample extends App {
// Log level configuration
Logger.root
.clearHandlers()
.clearModifiers()
.withHandler(minimumLevel = Some(Level.Debug))
.replace()
// Create custom formatter
val customFormatter: Formatter = formatter"[$threadName] $positionAbbreviated - $message$newLine"
// Console output configuration
Logger.root
.clearHandlers()
.withHandler(formatter = customFormatter)
.replace()
// File output configuration
val fileLogger = Logger("file-logger")
.withHandler(
writer = FileWriter("logs" / ("app-" % year % "-" % month % "-" % day % ".log")),
formatter = formatter"$date $levelPaddedRight $classNameAbbreviated.$methodName:$line - $message$newLine"
)
.replace()
// Logger configuration with multiple outputs
val multiOutputLogger = Logger("multi-output")
.withHandler(
writer = scribe.writer.ConsoleWriter,
formatter = formatter"[CONSOLE] $levelPaddedRight - $message$newLine"
)
.withHandler(
writer = FileWriter("logs" / "debug.log"),
formatter = formatter"$date [FILE] $levelPaddedRight $classNameAbbreviated - $message$newLine",
minimumLevel = Some(Level.Debug)
)
.replace()
// Test log output
scribe.info("Log output through programmatic configuration")
scribe.debug("Debug level information")
scribe.warn("Warning message")
scribe.error("Error message")
// File logger output
fileLogger.info("Information recorded to file")
// Multi-output logger output
multiOutputLogger.info("Output to both console and file")
multiOutputLogger.debug("Debug information also recorded to file")
}
Real-time Configuration Changes and Dynamic Control
import scribe._
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global
import scala.util.{Success, Failure}
class DynamicLoggingService {
private var currentLogLevel: Level = Level.Info
def adjustLogLevel(newLevel: Level): Unit = {
currentLogLevel = newLevel
// Real-time log level change
Logger.root
.clearHandlers()
.withHandler(minimumLevel = Some(newLevel))
.replace()
scribe.info(s"Log level changed to ${newLevel.name}")
}
def simulateHighLoadOperation(): Future[String] = {
scribe.info("High load operation started")
// Raise the minimum log level during high load to prioritize performance
val originalLevel = currentLogLevel
adjustLogLevel(Level.Warn)
val operation = Future {
for (i <- 1 to 10000) {
// Large amount of processing (debug logs not output)
scribe.debug(s"Processing: $i")
if (i % 1000 == 0) {
scribe.info(s"Progress: $i/10000")
}
}
"High load operation completed"
}
operation.onComplete {
case Success(result) =>
scribe.info(result)
// Return to original log level
adjustLogLevel(originalLevel)
case Failure(exception) =>
scribe.error("High load operation failed", exception)
adjustLogLevel(originalLevel)
}
operation
}
def demonstrateConditionalLogging(): Unit = {
val isProduction = false // Environment setting
if (isProduction) {
// Minimal logging in production environment
Logger.root
.clearHandlers()
.withHandler(minimumLevel = Some(Level.Warn))
.replace()
} else {
// Detailed logging in development environment
Logger.root
.clearHandlers()
.withHandler(minimumLevel = Some(Level.Trace))
.replace()
}
scribe.trace("Detailed trace information")
scribe.debug("Debug information")
scribe.info("General information")
scribe.warn("Warning")
scribe.error("Error")
}
}
object DynamicLoggingExample extends App {
val service = new DynamicLoggingService()
// Dynamic log level change demo
service.adjustLogLevel(Level.Debug)
service.demonstrateConditionalLogging()
// Performance priority configuration for high load operations
import scala.concurrent.duration._
import scala.concurrent.Await
val future = service.simulateHighLoadOperation()
Await.result(future, 10.seconds)
println("Dynamic logging configuration demo completed")
}
Structured Logging and Object Logging
import scribe._
import scribe.format._
// Logging support for custom objects
case class User(id: Long, name: String, email: String, role: String)
case class RequestInfo(method: String, path: String, duration: Long, statusCode: Int)
// Loggable instances for custom objects.
// Note: the key-value LogOutput builder used below is a sketch; the exact
// structured-output API for Loggable/LogOutput differs between Scribe versions.
implicit val userLoggable: Loggable[User] = new Loggable[User] {
override def apply(user: User): LogOutput = {
LogOutput.empty
.add("id" -> user.id)
.add("name" -> user.name)
.add("email" -> user.email)
.add("role" -> user.role)
}
}
implicit val requestInfoLoggable: Loggable[RequestInfo] = new Loggable[RequestInfo] {
override def apply(request: RequestInfo): LogOutput = {
LogOutput.empty
.add("method" -> request.method)
.add("path" -> request.path)
.add("duration" -> s"${request.duration}ms")
.add("status" -> request.statusCode)
}
}
class StructuredLoggingService {
def processUserRequest(user: User, requestInfo: RequestInfo): Unit = {
// Log entire object
scribe.info(s"User request processing started", user)
// Include multiple objects in log
scribe.info("Request details",
Map(
"user" -> user,
"request" -> requestInfo
)
)
// Performance measurement
val startTime = System.currentTimeMillis()
try {
handleBusinessLogic(user, requestInfo)
val actualDuration = System.currentTimeMillis() - startTime
val updatedRequest = requestInfo.copy(duration = actualDuration, statusCode = 200)
scribe.info("Request processing completed", updatedRequest)
} catch {
case ex: Exception =>
val actualDuration = System.currentTimeMillis() - startTime
val errorRequest = requestInfo.copy(duration = actualDuration, statusCode = 500)
scribe.error("Request processing error", Map(
"request" -> errorRequest,
"error" -> ex.getMessage,
"user" -> user
))
}
}
private def handleBusinessLogic(user: User, request: RequestInfo): Unit = {
scribe.debug(s"Business logic execution: ${request.method} ${request.path}")
request.path match {
case "/api/profile" =>
scribe.info("Profile retrieval process", Map("userId" -> user.id))
Thread.sleep(50)
case "/api/settings" =>
scribe.info("Settings update process", Map("userId" -> user.id, "role" -> user.role))
Thread.sleep(100)
case path if path.startsWith("/api/admin") =>
if (user.role != "admin") {
throw new SecurityException("Administrator privileges required")
}
scribe.warn("Administrator operation execution", Map("userId" -> user.id, "path" -> path))
Thread.sleep(200)
case _ =>
scribe.warn("Unknown endpoint", Map("path" -> request.path))
}
}
}
object StructuredLoggingExample extends App {
// Formatter configuration that appends MDC/structured data to each log line
Logger.root
.clearHandlers()
.withHandler(
formatter = formatter"$date $level $classNameAbbreviated.$methodName:$line - $message$mdc$newLine"
)
.replace()
val service = new StructuredLoggingService()
val users = List(
User(1001, "Taro Tanaka", "[email protected]", "user"),
User(1002, "Hanako Sato", "[email protected]", "admin"),
User(1003, "Ichiro Suzuki", "[email protected]", "user")
)
val requests = List(
RequestInfo("GET", "/api/profile", 0, 0),
RequestInfo("POST", "/api/settings", 0, 0),
RequestInfo("DELETE", "/api/admin/users", 0, 0),
RequestInfo("GET", "/api/unknown", 0, 0)
)
for {
user <- users
request <- requests.take(2) // 2 requests per user
} {
service.processUserRequest(user, request)
}
println("Structured logging demo completed")
}
Performance Testing and Benchmarking
import scribe._
import org.slf4j.LoggerFactory
import java.util.concurrent.{Executors, TimeUnit}
import scala.concurrent.{ExecutionContext, Future}
import scala.util.Random
class ScribePerformanceBenchmark {
def benchmarkSingleThreaded(iterations: Int = 1000000): Unit = {
println(s"Single-threaded benchmark ($iterations iterations)")
// Warmup
warmup()
// Scribe benchmark
val scribeTime = measureTime {
for (i <- 1 to iterations) {
scribe.debug(s"Scribe message $i with value ${Random.nextInt(1000)}")
}
}
// SLF4J comparison
val slf4jLogger = LoggerFactory.getLogger("benchmark")
val slf4jTime = measureTime {
for (i <- 1 to iterations) {
if (slf4jLogger.isDebugEnabled) {
slf4jLogger.debug(s"SLF4J message $i with value ${Random.nextInt(1000)}")
}
}
}
println(f"Scribe: ${scribeTime}%,d ms")
println(f"SLF4J: ${slf4jTime}%,d ms")
if (slf4jTime > 0) {
val improvement = ((slf4jTime.toDouble - scribeTime.toDouble) / slf4jTime * 100)
println(f"Scribe performance improvement: ${improvement}%.1f%%")
}
}
def benchmarkMultiThreaded(threads: Int = 10, iterations: Int = 100000): Unit = {
println(s"Multi-threaded benchmark (${threads} threads, $iterations iterations per thread)")
val executor = Executors.newFixedThreadPool(threads)
implicit val ec: ExecutionContext = ExecutionContext.fromExecutor(executor)
// Scribe multi-threaded test
val scribeStartTime = System.currentTimeMillis()
val scribeFutures = (1 to threads).map { threadId =>
Future {
for (i <- 1 to iterations) {
scribe.info(s"Thread $threadId message $i")
}
}
}
import scala.concurrent.Await
import scala.concurrent.duration._
Await.result(Future.sequence(scribeFutures), 30.seconds)
val scribeTime = System.currentTimeMillis() - scribeStartTime
println(f"Scribe multi-threaded: ${scribeTime}%,d ms")
println(f"Throughput: ${(threads * iterations * 1000.0 / scribeTime)}%,.0f messages/sec")
executor.shutdown()
executor.awaitTermination(5, TimeUnit.SECONDS)
}
def benchmarkAsyncLogging(): Unit = {
println("Asynchronous logging benchmark")
// Handler configuration (this example uses the synchronous ConsoleWriter; Scribe also offers asynchronous handlers)
Logger.root
.clearHandlers()
.withHandler(
writer = scribe.writer.ConsoleWriter,
minimumLevel = Some(Level.Info)
)
.replace()
val iterations = 500000
val startTime = System.currentTimeMillis()
for (i <- 1 to iterations) {
scribe.info(s"Async message $i")
}
val asyncTime = System.currentTimeMillis() - startTime
// Give any buffered or asynchronous handlers a moment to flush before exiting
Thread.sleep(1000)
println(f"Asynchronous logging: ${asyncTime}%,d ms")
println(f"Throughput: ${(iterations * 1000.0 / asyncTime)}%,.0f messages/sec")
}
private def warmup(): Unit = {
for (_ <- 1 to 10000) {
scribe.debug("warmup")
}
}
private def measureTime(operation: => Unit): Long = {
val startTime = System.currentTimeMillis()
operation
System.currentTimeMillis() - startTime
}
}
object PerformanceBenchmarkRunner extends App {
// Measure performance with DEBUG level disabled
Logger.root
.clearHandlers()
.withHandler(minimumLevel = Some(Level.Info))
.replace()
val benchmark = new ScribePerformanceBenchmark()
benchmark.benchmarkSingleThreaded(2000000)
println()
benchmark.benchmarkMultiThreaded(threads = 8, iterations = 250000)
println()
benchmark.benchmarkAsyncLogging()
println("\nScribe performance test completed")
}