Imported blogs from Qiita
2020-03-28
Qiita began to provide a feature that shows what types of articles I write and give +1 to. At that time, they made what I read public without any opt-in: ユーザーページをリニューアルしました ("We renewed the user page") - Qiita Blog.
I'd rather not make my browsing history public, so I was going to delete my account from qiita.com.
Although the feature was later removed, I thought migrating my articles here would be a good way to gather all of them in a single place.
Qiita: https://qiita.com/petitviolet
As a result of the migration, there are now many posts dated before the HelloWorld post, which I wrote as the first article on this blog :D
## How to import
First, download my content from qiita.com via the API. The qiita rubygem wraps the API, which makes it easy to download articles as JSON.

```shell
$ gem install qiita
$ qiita list_user_items petitviolet per_page=100 > qiita.json
```
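The conversion script only relies on a handful of fields from each item in qiita.json. Here is a minimal illustration of the shape with made-up sample values; the real Qiita API response contains many more fields.

````ruby
require 'json'

# Illustrative shape of a single item in qiita.json; the values are made up,
# but these are exactly the fields the conversion script reads.
item = JSON.parse(<<~'JSON')
  {
    "title": "[Scala] Some article",
    "created_at": "2015-01-02T03:04:05+09:00",
    "url": "https://qiita.com/petitviolet/items/0123456789abcdef0123",
    "tags": [{ "name": "Scala" }, { "name": "sbt" }],
    "body": "# Heading\n\nArticle body in markdown."
  }
JSON

puts item["title"]
puts item["tags"].map { |t| t["name"] }.join(", ")
````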
Then, write a Ruby script to convert each article into the format that gatsby-transformer-remark expects.
````ruby
require 'json'
require 'fileutils'
require 'open-uri'
require 'date'

contents = JSON.parse(File.read('./qiita.json'))

contents.each_with_index do |content, i|
  title = content["title"].gsub(/\[(.*)\]/, '\1 - ')
  datetime = DateTime.parse(content["created_at"])
  markdown = content["body"]
  # sanitize description
  description = markdown.slice(0, 100)
    .gsub("\n", " ")
    .gsub("#", "")
    .gsub(" ", "")
    .gsub('"', '\"')
    .gsub(/\[(.+?)\]\(.+?\)/, '\1')
    .gsub('=', '')
    .gsub(/```/, ' ')
    .gsub(/`/, '')
    .slice(0, 40)
    .strip
  tags = content["tags"].map { |h| "\"#{h["name"]}\"" }
  original_url = content["url"]
  body = <<~EOS
    ---
    title: #{title}
    date: "#{datetime.to_s}"
    description: "#{description}"
    tags: ["Qiita", #{tags.join(", ")}]
    ---
    #{markdown}
    from: #{original_url}
  EOS

  # download each image hosted on Qiita, then rewrite its URL in the body
  image_urls = body.scan(/https:\/\/qiita-image-store[^)]*?\.png/)
  image_urls.each do |image_url|
    image_name = "qiita_#{i}_#{image_url[/([^\/]*?\.png)/]}"
    URI.open(image_url) do |image|
      File.open("./qiita/#{image_name}", "wb") do |file|
        file.write(image.read)
      end
      system("AWS_PROFILE=xxx /usr/local/bin/s3_image upload ./qiita/#{image_name}")
      puts "uploaded #{image_url} to https://static.petitviolet.net/image/#{image_name}"
    end
    body.gsub!(image_url, "https://static.petitviolet.net/image/#{image_name}")
  end

  dirname = "./content/blog/#{datetime.to_date.to_s}/#{title.downcase.gsub(' - ', '-').gsub(" ", "-").gsub("/", "_")}"
  filename = "#{dirname}/index.md"
  puts "filename: #{filename}"
  FileUtils.mkdir_p(dirname) rescue puts "error in mkdir #{dirname}: #{$!.inspect}"
  File.delete(filename) rescue puts "error in deleting #{filename}: #{$!.inspect}"
  File.open(filename, 'w') { |f| f.write(body) }
end
````
This script downloads all images hosted by Qiita and uploads them to my own S3 bucket using s3_image.sh.
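s3_image.sh itself isn't shown here, but judging from how it is invoked, it presumably wraps the AWS CLI. As a minimal sketch, the command it builds might look like the following; the bucket name and the `image/` key prefix are assumptions inferred from the rewritten URLs, not the actual script.

````ruby
require 'shellwords'

# Hypothetical reconstruction of what `s3_image upload <path>` might run;
# the bucket name and `image/` prefix are assumptions, not confirmed values.
def s3_upload_command(local_path, bucket: 'static.petitviolet.net')
  name = File.basename(local_path)
  "aws s3 cp #{Shellwords.escape(local_path)} s3://#{bucket}/image/#{name}"
end

puts s3_upload_command('./qiita/qiita_0_example.png')
# → aws s3 cp ./qiita/qiita_0_example.png s3://static.petitviolet.net/image/qiita_0_example.png
````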