Commit fd0aef2: fix dead links

1 parent 0ff93bf commit fd0aef2

File tree

3 files changed: +3 −3 lines

  • blog
  • docs/user-guide/ingest-data/for-observerbility
  • i18n/zh/docusaurus-plugin-content-docs/current/user-guide/ingest-data/for-observerbility


blog/release-0-7-2.md

Lines changed: 1 addition & 1 deletion

@@ -6,7 +6,7 @@ date: 2024-04-08
 
 Release date: April 08, 2024
 
-This is a patch release, containing a critial bug fix to avoid wrongly delete data files ([#3635](https://github.com/GreptimeTeam/greptimedb/pull/3635)).
+This is a patch release, containing a critical bug fix to avoid wrongly delete data files ([#3635](https://github.com/GreptimeTeam/greptimedb/pull/3635)).
 
 **It's highly recommended to upgrade to this version if you're using v0.7.**

docs/user-guide/ingest-data/for-observerbility/kafka.md

Lines changed: 1 addition & 1 deletion

@@ -76,7 +76,7 @@ A pipeline processes the logs into structured data before ingestion into GreptimeDB.
 ### Logs with JSON format
 
 For logs in JSON format (e.g., `{"timestamp": "2024-12-23T10:00:00Z", "level": "INFO", "message": "Service started"}`),
-you can use the built-in [`greptime_identity`](/logs/manage-pipelines.md#greptime_identity) pipeline for direct ingestion.
+you can use the built-in [`greptime_identity`](/user-guide/logs/manage-pipelines.md#greptime_identity) pipeline for direct ingestion.
 This pipeline creates columns automatically based on the fields in your JSON log message.
 
 Simply configure Vector's `transforms` settings to parse the JSON message and use the `greptime_identity` pipeline as shown in the following example:
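The Vector example referenced in that file is not reproduced in this diff. As an illustrative sketch only, a configuration in this spirit might look like the following; the Kafka broker, topic, endpoint, database, and table names are placeholders, not values from the source:

```toml
# Hypothetical sketch: read logs from Kafka, parse the JSON message body
# with a remap transform, and ship the result to GreptimeDB using the
# built-in greptime_identity pipeline. All names and addresses below are
# placeholders, not values from the commit under discussion.

[sources.kafka_logs]
type = "kafka"
bootstrap_servers = "kafka:9092"   # placeholder broker address
group_id = "vector"
topics = ["app_logs"]              # placeholder topic

[transforms.parse_json]
type = "remap"
inputs = ["kafka_logs"]
# VRL: replace the event with the parsed JSON payload of .message.
source = """
. = parse_json!(string!(.message))
"""

[sinks.greptimedb]
type = "greptimedb_logs"
inputs = ["parse_json"]
endpoint = "http://greptimedb:4000"  # placeholder endpoint
dbname = "public"
table = "demo_logs"                  # placeholder table name
pipeline_name = "greptime_identity"  # built-in identity pipeline
```

With `greptime_identity` there is no custom pipeline definition to write: each top-level field of the parsed JSON event becomes a table column automatically, which suits logs that are already structured.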

i18n/zh/docusaurus-plugin-content-docs/current/user-guide/ingest-data/for-observerbility/kafka.md

Lines changed: 1 addition & 1 deletion

@@ -75,7 +75,7 @@ A pipeline processes logs into structured data before writing them to GreptimeDB.
 ### Logs in JSON format
 
 For logs in JSON format (e.g., `{"timestamp": "2024-12-23T10:00:00Z", "level": "INFO", "message": "Service started"}`),
-you can use the built-in [`greptime_identity`](/logs/manage-pipelines.md#greptime_identity) pipeline to ingest logs directly.
+you can use the built-in [`greptime_identity`](/user-guide/logs/manage-pipelines.md#greptime_identity) pipeline to ingest logs directly.
 This pipeline automatically creates columns based on the fields in the JSON log message.
 
 You only need to configure Vector's `transforms` settings to parse the JSON message and use the `greptime_identity` pipeline, as shown in the following example:
