Working with exported data

Originally shared by Filip H.F. “FiXato” Slagter

Progress on Takeout Google+ Stream Posts Export Tool
I've been making some progress with the Google Data Takeout conversion part of my #PlexodusTools project. One of the scripts can now parse all the JSON files in the extracted Takeout/Google+ Stream/Posts directory and convert them to simplified HTML files in an (imho) logical directory structure, based on what kind of post it is and what its privacy setting is (a rough sketch of this routing follows the directory list below):
data/output/html/exported_activities/limited/communities/$communityID-$communityName/$userID
data/output/html/exported_activities/limited/events/$eventID-$eventName/$userID
data/output/html/exported_activities/limited/posts
data/output/html/exported_activities/limited/posts/$userID/circles/$circleName
data/output/html/exported_activities/limited/posts/$userID/collections/$collectionID-$collectionName
data/output/html/exported_activities/limited/posts/$userID/users/$userId-$userName
data/output/html/exported_activities/public/posts/$userID/public
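To illustrate the routing idea, here is a minimal Python sketch. It is not the actual Plexodus-Tools code, and the JSON field names it reads (postAcl, communityAcl, eventAcl, comments, author, content) are my assumptions about the Takeout schema; the directory layout is also simplified compared to the full tree above.

#!/usr/bin/env python3
"""Illustrative sketch only: walk the extracted Takeout/Google+ Stream/Posts
directory, pick an output directory based on the post's audience, and write
a simplified HTML file containing just the post's comments."""
import html
import json
from pathlib import Path

INPUT_DIR = Path("Takeout/Google+ Stream/Posts")
OUTPUT_DIR = Path("data/output/html/exported_activities")


def category_dir(post: dict) -> Path:
    """Return the sub-directory (relative to OUTPUT_DIR) for this post,
    mirroring the public/limited split described above (simplified:
    no per-user or per-community sub-directories)."""
    acl = post.get("postAcl", {})
    if acl.get("isPublic"):
        return Path("public/posts")
    if "communityAcl" in acl:
        return Path("limited/communities")
    if "eventAcl" in acl:
        return Path("limited/events")
    return Path("limited/posts")


def render_comments(post: dict) -> str:
    """Very plain HTML: one <li> per comment, nothing else."""
    items = []
    for comment in post.get("comments", []):
        author = comment.get("author", {}).get("displayName", "unknown")
        body = comment.get("content", "")  # assumed to already be HTML
        items.append(f"<li><strong>{html.escape(author)}</strong>: {body}</li>")
    if not items:
        return "<p>No comments.</p>"
    return "<ul>\n" + "\n".join(items) + "\n</ul>"


def convert_all() -> None:
    for json_file in INPUT_DIR.glob("*.json"):
        post = json.loads(json_file.read_text(encoding="utf-8"))
        target_dir = OUTPUT_DIR / category_dir(post)
        target_dir.mkdir(parents=True, exist_ok=True)
        out_file = target_dir / (json_file.stem + ".html")
        out_file.write_text(
            "<!doctype html>\n<html><body>\n"
            f"{render_comments(post)}\n"
            "</body></html>\n",
            encoding="utf-8",
        )


if __name__ == "__main__":
    convert_all()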

Currently the script only outputs the comments from each JSON file, but hopefully I'll have a basic view of the actual post content as well by tomorrow.
I'll try to keep the HTML as simple as possible, so users can restructure and restyle it as they see fit.

Once I've laid this foundation, I can add different templates for different output formats, for instance Atom for import into Blogger.
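As a sketch of that "different templates" idea (again assuming the same hypothetical field names as above, not the project's actual template code), the same parsed post dict could be fed through an Atom entry template instead of the HTML one:

from string import Template
from xml.sax.saxutils import escape

# A single Atom <entry>; a full feed would wrap these in a <feed> element.
ATOM_ENTRY = Template("""\
<entry>
  <title type="text">$title</title>
  <published>$published</published>
  <author><name>$author</name></author>
  <content type="html">$content</content>
</entry>
""")


def render_atom_entry(post: dict) -> str:
    """Render one post as an Atom entry, escaping the HTML body for XML."""
    return ATOM_ENTRY.substitute(
        title=escape(post.get("title", "Google+ post")),
        published=post.get("creationTime", ""),
        author=escape(post.get("author", {}).get("displayName", "unknown")),
        content=escape(post.get("content", "")),
    )

Swapping output formats then only means swapping the template and the file extension; the parsing and directory-routing code stays the same.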

You can follow the changes at https://github.com/FiXato/Plexodus-Tools/commits/master

#GooglePlus #GPlus #Plexodus #GooglePlusExodus #GPlusExodus #GooglePlusShutdown #GPlusShutdown #Development
