# Code-Driven Documentation: How to Eliminate Errors in Configuration Documentation
This document describes how to use code to improve the accuracy of configuration documentation. It is written particularly for technical writers who might not be well-versed in coding but are responsible for creating or reviewing the documentation of configuration files. By following the examples in this document, writers can make sure that their configuration documentation is accurate, consistent, and easy to understand, while also reducing the risk of errors and inconsistencies.

Documentation-driven development emphasizes creating documentation that meets the needs of users. Code-driven documentation, in contrast, prioritizes the accuracy and consistency of documentation by relying on the source of truth: the code itself. By using code to verify documentation, technical writers can make sure that their documentation stays up to date and correctly reflects the behavior of the system, which reduces errors and improves the overall quality of the documentation.
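As a small illustration of this idea, such a verification can even be automated. The following Go sketch is purely illustrative and assumes nothing about any particular code base: `defaultLogLevel` stands in for a default value defined in code, and `docLine` stands in for a line copied from the documentation being checked.

```go
// A minimal sketch of code-driven verification: compare a default value
// documented in prose with the default value defined in code.
// Both constants below are hypothetical stand-ins, not real project code.
package main

import (
	"fmt"
	"regexp"
)

// defaultLogLevel stands in for a default that lives in the code base.
const defaultLogLevel = "info"

// docLine stands in for a line copied from the documentation being checked.
const docLine = "+ Default value: `info`"

func main() {
	// Extract the value written between backticks in the documentation line.
	re := regexp.MustCompile("`([^`]+)`")
	m := re.FindStringSubmatch(docLine)
	if m == nil {
		fmt.Println("no default value found in the documentation line")
		return
	}
	if m[1] == defaultLogLevel {
		fmt.Printf("documentation matches code: %q\n", m[1])
	} else {
		fmt.Printf("mismatch: docs say %q, code says %q\n", m[1], defaultLogLevel)
	}
}
```

A check like this can run in CI, so the documentation is re-verified every time the code changes.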
The following sections use several configuration files as examples to illustrate how to verify configuration documentation against code:
| Product | Configuration file | Language | Example |
|---------|--------------------|----------|---------|
| TiDB | config.toml | Go | Example 1 |
| TiKV | config.toml | Rust | Example 2 |
| TiFlash | tiflash.toml | C++ | Example 3 |
| PD | config.toml | Go | Example 4 |
## Example 1: TiDB configuration file & config.go
The TiDB configuration file is written in TOML format. Its documentation is the TiDB Configuration File document.
### Steps
The following takes `log.level` as an example. In the documentation, the item is described as follows:

```markdown
## Log

Configuration items related to log.

### `level`

+ Specifies the log output level.
+ Value options: `debug`, `info`, `warn`, `error`, and `fatal`.
+ Default value: `info`
```
-   Create a shallow clone of the TiDB repository:

    ```bash
    git clone https://github.com/pingcap/tidb.git --depth=1 tidb
    ```
-   Search for `"level"` or `toml:"level"` in the `tidb` folder. The following uses the `find` and `grep` commands to search for and list all files that contain the `"level"` keyword.

    Command:

    ```bash
    cd tidb
    find . | grep -l -r '"level"'
    ```

    Output:

    ```
    ./config/config.go
    ./planner/core/memtable_predicate_extractor.go
    ./dumpling/log/log.go
    ./parser/parser_test.go
    ./parser/parser.go
    ./infoschema/metric_table_def.go
    ./br/pkg/lightning/tikv/tikv.go
    ./br/pkg/lightning/lightning.go
    ./br/pkg/lightning/log/log.go
    ./br/pkg/lightning/restore/precheck_impl_test.go
    ./sessionctx/stmtctx/stmtctx.go
    ```

    From the preceding output, you can skip the `./parser/parser_test.go` and `./br/pkg/lightning/restore/precheck_impl_test.go` files because they are test files. Then, search for the `"level"` keyword again.

    Command:

    ```bash
    find . | grep -r '"level"' --exclude "*_test.go"
    ```

    Output:

    ```
    ./config/config.go: Level string `toml:"level" json:"level"`
    ./planner/core/memtable_predicate_extractor.go: remained, levlSkipRequest, logLevels := e.extractCol(schema, names, remained, "level", true)
    ./dumpling/log/log.go: Level string `toml:"level" json:"level"`
    ./parser/parser.go: "level",
    ./infoschema/metric_table_def.go: Labels: []string{"instance", "level", "db"},
    ./infoschema/metric_table_def.go: Labels: []string{"instance", "cf", "level", "db"},
    ./br/pkg/lightning/tikv/tikv.go: task := log.With(zap.Int32("level", level), zap.String("tikv", tikvAddr)).Begin(zap.InfoLevel, "compact cluster")
    ./br/pkg/lightning/lightning.go: Level zapcore.Level `json:"level"`
    ./br/pkg/lightning/log/log.go: Level string `toml:"level" json:"level"`
    ./sessionctx/stmtctx/stmtctx.go: Level string `json:"level"`
    ```

    Then, you can see the context of the `"level"` keyword in these files:
    `config/config.go`:

    ```go
    type Log struct {
    	// Log level.
    	Level string `toml:"level" json:"level"`
    	// ...
    }
    ```

    `planner/core/memtable_predicate_extractor.go`:

    ```go
    func (e *ClusterLogTableExtractor) Extract(
    	ctx sessionctx.Context,
    	schema *expression.Schema,
    	names []*types.FieldName,
    	predicates []expression.Expression,
    ) []expression.Expression {
    	// ...
    	remained, levlSkipRequest, logLevels := e.extractCol(schema, names, remained, "level", true)
    	e.SkipRequest = typeSkipRequest || addrSkipRequest || levlSkipRequest
    	// ...
    ```

    `dumpling/log/log.go`:

    ```go
    // Config serializes log related config in toml/json.
    type Config struct {
    	// Log level.
    	// One of "debug", "info", "warn", "error", "dpanic", "panic", and "fatal".
    	Level string `toml:"level" json:"level"`
    	// ...
    }
    ```

    `parser/parser.go`:

    ```go
    yySymNames = []string{
    	// ...
    	"language",
    	"level",
    	"list",
    	// ...
    }
    ```

    `infoschema/metric_table_def.go`:

    ```go
    var MetricTableMap = map[string]MetricTableDef{
    	// ...
    	"tikv_compression_ratio": {
    		PromQL:  `avg(tikv_engine_compression_ratio{$LABEL_CONDITIONS}) by (level,instance,db)`,
    		Labels:  []string{"instance", "level", "db"},
    		Comment: "The compression ratio of each level",
    	},
    	// ...
    	"tikv_number_files_at_each_level": {
    		PromQL:  `avg(tikv_engine_num_files_at_level{$LABEL_CONDITIONS}) by (cf, level,db,instance)`,
    		Labels:  []string{"instance", "cf", "level", "db"},
    		Comment: "The number of SST files for different column families in each level",
    	},
    	// ...
    }
    ```

    `br/pkg/lightning/tikv/tikv.go`:

    ```go
    // Compact performs a leveled compaction with the given minimum level.
    func Compact(ctx context.Context, tls *common.TLS, tikvAddr string, level int32) error {
    	task := log.With(zap.Int32("level", level), zap.String("tikv", tikvAddr)).Begin(zap.InfoLevel, "compact cluster")
    	// ...
    }
    ```

    `br/pkg/lightning/lightning.go`:

    ```go
    func handleLogLevel(w http.ResponseWriter, req *http.Request) {
    	w.Header().Set("Content-Type", "application/json")
    	var logLevel struct {
    		Level zapcore.Level `json:"level"`
    	}
    	// ...
    }
    ```

    `br/pkg/lightning/log/log.go`:

    ```go
    type Config struct {
    	// Log level.
    	Level string `toml:"level" json:"level"`
    	// Log filename, leave empty to disable file log.
    	File string `toml:"file" json:"file"`
    	// ...
    }
    ```

    `sessionctx/stmtctx/stmtctx.go`:

    ```go
    type jsonSQLWarn struct {
    	Level  string        `json:"level"`
    	SQLErr *terror.Error `json:"err,omitempty"`
    	Msg    string        `json:"msg,omitempty"`
    }
    ```
-   Verify the data type:

    `config/config.go`:

    ```go
    type Log struct {
    	// Log level.
    	Level string `toml:"level" json:"level"`
    	// ...
    }
    ```

    The `level` item is defined in the `Log` struct, the variable name is `Level`, and the type is `string`. Then, you can verify whether the type of `log.level` in the document is consistent with the type of `Level` in the code.
-   To verify the default value, search for `Level` in the `config/config.go` file. You can find that the default value of `Level` is `"info"`:

    `config/config.go`:

    ```go
    var defaultConf = Config{
    	Host:             DefHost,
    	AdvertiseAddress: "",
    	Port:             DefPort,
    	// ...
    	Log: Log{
    		Level:  "info",
    		Format: "text",
    		// ...
    	},
    	// ...
    }
    ```
-   To verify whether `level` is in the `log` table, search for `"log"` in the `config.go` file. You can find the following:

    `config/config.go`:

    ```go
    type Config struct {
    	Host string `toml:"host" json:"host"`
    	// ...
    	Log Log `toml:"log" json:"log"`
    	// ...
    }

    // ...

    type Log struct {
    	// Log level.
    	Level string `toml:"level" json:"level"`
    	// ...
    }
    ```

    The `Log` struct is tagged `toml:"log"`, so it corresponds to the `log` table, and `level` is defined in the `Log` struct. That is, `level` is in the `log` table. For a programmatic version of the checks in the last three steps, see the sketch after this list.
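The type, default value, and table checks above can also be sketched as a small program. The following is a minimal, self-contained sketch rather than TiDB code: the `Config` and `Log` structs mirror only the fields excerpted above, and the `github.com/BurntSushi/toml` library is assumed as the TOML decoder (the `toml:"..."` struct tags imply a decoder of this kind, but the exact library is an assumption here).

```go
// A minimal sketch, not TiDB code: Config and Log mirror only the fields
// excerpted above so that the documented claims about log.level
// (string type, [log] table, default "info") can be checked by decoding TOML.
package main

import (
	"fmt"

	"github.com/BurntSushi/toml"
)

// Log mirrors the excerpted Log struct.
type Log struct {
	Level string `toml:"level"`
}

// Config mirrors the excerpted Config struct.
type Config struct {
	Log Log `toml:"log"`
}

func main() {
	// The default value claimed by the documentation.
	const documentedDefault = "info"

	// A configuration file that does not set log.level at all.
	const snippet = `
[log]
# level is intentionally left unset.
`

	// Start from the documented default, then overlay the user's file,
	// mimicking how a defaults struct is usually combined with a config file.
	cfg := Config{Log: Log{Level: documentedDefault}}
	if _, err := toml.Decode(snippet, &cfg); err != nil {
		fmt.Println("decode failed:", err)
		return
	}
	fmt.Printf("log.level = %q (Go type %T)\n", cfg.Log.Level, cfg.Log.Level)
	// Prints: log.level = "info" (Go type string)
}
```

If the documentation placed `level` in the wrong table or gave it the wrong type, decoding a snippet written according to the documentation would either fail or silently leave the field untouched, which is exactly the kind of mismatch this approach surfaces.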
### Conclusion

In the `config/config.go` file, you can verify the following information about a configuration item `CONFIG-NAME` by searching for `"CONFIG-NAME"`:
- The type of a configuration item.
- The default value of a configuration item.
- The table that a configuration item belongs to.
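As a complement to searching by hand, the same kind of information can be extracted from the code mechanically. The sketch below is illustrative only: it reflects over locally defined stand-in structs (not the real TiDB definitions) and prints each item's table, name, and Go type, which can then be compared against the documentation.

```go
// An illustrative sketch: walk a configuration struct with reflection and
// print each item's table path (from the toml tags) and its Go type.
// Log and Config below are local stand-ins, not the real TiDB definitions.
package main

import (
	"fmt"
	"reflect"
)

type Log struct {
	Level  string `toml:"level"`
	Format string `toml:"format"`
}

type Config struct {
	Host string `toml:"host"`
	Log  Log    `toml:"log"`
}

// walk prints "table.item: type" for every tagged field, recursing into
// nested structs so that each item is listed under its table name.
func walk(prefix string, t reflect.Type) {
	for i := 0; i < t.NumField(); i++ {
		f := t.Field(i)
		tag := f.Tag.Get("toml")
		if tag == "" {
			continue
		}
		name := tag
		if prefix != "" {
			name = prefix + "." + tag
		}
		if f.Type.Kind() == reflect.Struct {
			walk(name, f.Type)
			continue
		}
		fmt.Printf("%s: %s\n", name, f.Type)
	}
}

func main() {
	walk("", reflect.TypeOf(Config{}))
	// Prints:
	// host: string
	// log.level: string
	// log.format: string
}
```

Run against the real configuration structs, a listing like this turns the manual comparison above into one that can be repeated automatically whenever the code changes.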
## Example 2: TiKV configuration file & config.rs
The TiKV configuration file is written in TOML format. Its documentation is the TiKV Configuration File document.
### Steps
The following takes `raftstore.right-derive-when-split` as an example:
-   Create a shallow clone of the TiKV repository:

    ```bash
    git clone https://github.com/tikv/tikv.git --depth=1 tikv
    ```
-   Search for `right(.*)derive(.*)when(.*)split` in the `tikv` folder, for example with `grep -rn -E 'right(.*)derive(.*)when(.*)split' .`.
-   Verify the data type in the `components/raftstore/src/store/config.rs` file:

    The configuration item is defined as `pub right_derive_when_split: bool` in the `Config` struct, so the type is `bool`. Then, you can verify whether the type of `raftstore.right-derive-when-split` in the document is consistent with the type in the code.

    `components/raftstore/src/store/config.rs`:

    ```rust
    struct Config {
        // Right region derive origin region id when split.
        #[online_config(hidden)]
        pub right_derive_when_split: bool,
        // ...
    }
    ```