google-cloud-platform - Automated BigTable backups
Problem description
A Bigtable table can be backed up through GCP, and the backup retained for up to 30 days. (https://cloud.google.com/bigtable/docs/backups)
Is it possible to have a custom automatic backup policy?
i.e. trigger automatic backups every X days and keep up to 3 copies at a time.
Solution
As mentioned in the comment, the link describes a solution that uses the following GCP products:
Cloud Scheduler: trigger tasks with a cron-based schedule
Cloud Pub/Sub: pass the message request from Cloud Scheduler to Cloud Functions
Cloud Functions: initiate an operation for creating a Cloud Bigtable backup
Cloud Logging and Monitoring (optional).
The full guide is also available on GitHub.
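The Scheduler → Pub/Sub → Functions pipeline above can be sketched as a small Cloud Function. This is a minimal sketch, not the guide's exact code: the message fields (`instance_id`, `cluster_id`, `table_id`) and function names are assumptions, and the Bigtable client call is isolated in `create_backup` so it only runs when deployed with credentials:

```python
import base64
import json
from datetime import datetime, timedelta, timezone

def build_backup_request(table_id, now=None, retention_days=30):
    """Derive a unique backup id and its expiry time from the current date."""
    now = now or datetime.now(timezone.utc)
    backup_id = "{}-backup-{}".format(table_id, now.strftime("%Y%m%d%H%M%S"))
    expire_time = now + timedelta(days=retention_days)
    return backup_id, expire_time

def handle_pubsub(event, context):
    """Cloud Functions entry point, triggered by the Scheduler's Pub/Sub message."""
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    backup_id, expire_time = build_backup_request(payload["table_id"])
    create_backup(payload["instance_id"], payload["cluster_id"],
                  payload["table_id"], backup_id, expire_time)

def create_backup(instance_id, cluster_id, table_id, backup_id, expire_time):
    # Imported here so the pure helpers above work without the library installed.
    from google.cloud import bigtable  # requires google-cloud-bigtable
    client = bigtable.Client(admin=True)
    table = client.instance(instance_id).table(table_id)
    backup = table.backup(backup_id, cluster_id=cluster_id, expire_time=expire_time)
    backup.create()  # starts a long-running backup operation
```

The Pub/Sub message published by the Scheduler job would carry the instance, cluster, and table ids so one function can serve several tables.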
This is a good fit for your case: Bigtable doesn't expose an API that keeps only 3 copies at a time, so that part of the requirement has to be implemented with the client libraries.
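The "keep 3 copies" pruning can be sketched as a pure function over backup metadata; the ids and timestamps below are illustrative, and in the real function the list/delete calls would go through the same client library:

```python
def backups_to_delete(backups, keep=3):
    """Given (backup_id, start_time) pairs, return the ids of the oldest
    backups beyond the `keep` newest ones, i.e. the ones to delete."""
    ordered = sorted(backups, key=lambda b: b[1], reverse=True)  # newest first
    return [backup_id for backup_id, _ in ordered[keep:]]
```

Running this after each successful backup keeps the copy count bounded regardless of the schedule.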
For simpler use cases, however, such as triggering automatic backups every X days, there's a lighter alternative: call `backups.create` directly from a Cloud Scheduler job with an HTTP target, similar to what's done in this answer.
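For that variant, the Scheduler job would POST to the Bigtable Admin API's `backups.create` REST endpoint. A sketch of how the target URL and request body are assembled (the project, instance, cluster, and table names are placeholders, and the deployed job would also need a service account for OAuth):

```python
import json

def backups_create_request(project, instance, cluster, table,
                           backup_id, expire_time_rfc3339):
    """Build the URL and JSON body for a bigtableadmin v2 backups.create call."""
    parent = f"projects/{project}/instances/{instance}/clusters/{cluster}"
    url = f"https://bigtableadmin.googleapis.com/v2/{parent}/backups?backupId={backup_id}"
    body = {
        "sourceTable": f"projects/{project}/instances/{instance}/tables/{table}",
        "expireTime": expire_time_rfc3339,  # RFC 3339 timestamp, max 30 days out
    }
    return url, json.dumps(body)
```

These two strings map directly onto the URI and body fields of a Cloud Scheduler HTTP-target job.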