Project: nodes / Duniter Datapod

Commit 6dc4598f ("add intermediate files")
Authored 11 months ago by Hugo Trentesaux
Parent: 728dfdac
Showing 3 changed files with 61 additions and 20 deletions:

  .gitignore                         +3  −1
  input/.gitkeep                     +0  −0
  src/scripts/cesium-plus-import.ts  +58 −19
.gitignore  (+3 −1)

@@ -8,4 +8,6 @@ node_modules
 *.local
 
 # Env
-.env
\ No newline at end of file
+.env
+
+input/*
\ No newline at end of file
input/.gitkeep  (new file, mode 0 → 100644, +0 −0, empty)
src/scripts/cesium-plus-import.ts  (+58 −19)
@@ -6,7 +6,8 @@ import {
   cplusRawIrCID,
   cplusIrToAMT
 } from '../cesium-plus'
 import * as fs from 'fs/promises'
 import { createInterface } from 'readline'
+import { appendFile, createReadStream } from 'fs'
 import { timestampToKey } from '../processor'
 
 // profile files
@@ -61,17 +62,15 @@ async function fetchRawCplus(id: string): Promise<string> {
   return fetch(ENDPOINT + '/user/profile/' + id + '/_source').then((b) => b.text())
 }
 
-/// download all c+ data and add them to kubo
-async function getAllCplusIr(): Promise<Array<[string, CID]>> {
+/// download all c+ data and add them to a file
+// strange behavior: all lines are written (`wc -l input/cplusimport.jsonl`) way before logging finishes
+async function getAllCplusIr() {
+  const filename = './input/cplusimport.jsonl'
   const SCROLL_TIME = '5m'
-  const PAGE_SIZE = 100
+  const PAGE_SIZE = 1000
   const ENDPOINT = 'https://g1.data.e-is.pro'
   const cids: Array<Promise<[string, CID]>> = []
   const URL = ENDPOINT + '/user/profile/_search?scroll=' + SCROLL_TIME
   const decoder = new TextDecoder()
 
   // first batch
   let batch = await fetch(URL + '&_source=false&filter_path=took,_scroll_id,hits.hits._id,hits.total', {
     method: 'post',
@@ -88,22 +87,17 @@ async function getAllCplusIr(): Promise<Array<[string, CID]>> {
   console.log(`importing ${total} cplus profiles...`)
 
   // process batches while available
-  while (batch.took > 0) {
+  while (totalimported < total) {
     // add to the list
     for (const hit of batch.hits.hits) {
-      cids.push(
-        fetchRawCplus(hit._id)
-          .then((cplusRaw) => [cplusRaw, JSON.parse(cplusRaw)])
-          .then(([cplusRaw, cplus]) => Promise.all([timestampToKey(cplus.time), cplusRawIrCID(cplus, cplusRaw)]))
-      )
+      fetchRawCplus(hit._id).then((cplusRaw) => appendFile(filename, cplusRaw + '\n', () => {}))
     }
     // console.log(ids)
     imported += batch.took
     totalimported += batch.took
-    if (imported > 100) {
-      console.log(`${totalimported.toString().padStart(5)} / ${total} imported`)
+    if (imported > 1000) {
+      console.log(`${totalimported.toString().padStart(5)} / ${total}`)
       imported = 0
-      await Promise.all(cids)
     }
 
     // take next batch
@@ -118,8 +112,51 @@ async function getAllCplusIr(): Promise<Array<[string, CID]>> {
     scroll_id = batch._scroll_id
     // console.log(scroll_id)
   }
-  return Promise.all(cids).then((l) => l.sort())
 }
 
+/// put all raw cplus profiles to ipfs node in index request and write result to a file
+async function putIrToIPFS() {
+  const LIMIT = 500 // max number of lines to process simultaneously
+  const NOTIF = 2000 // log every N lines processed
+  const input = './input/cplusimport.jsonl'
+  const output = './input/cplusIR.txt'
+  const rejected = './input/cplusHS.txt'
+  let queueSize = 0
+  let read = 0
+
+  function process(line: string) {
+    queueSize++
+    if (queueSize > LIMIT) {
+      linereader.pause()
+    }
+    try {
+      const cplus = JSON.parse(line)
+      cplusRawIrCID(cplus, line)
+        .then((cid) => timestampToKey(cplus.time) + ' ' + cid.toString() + '\n')
+        .then((l) =>
+          appendFile(output, l, () => {
+            read++
+            queueSize--
+            if (queueSize < LIMIT) {
+              linereader.resume()
+            }
+            if (read % NOTIF == 0) {
+              console.log(`processed ${read} lines`)
+            }
+          })
+        ).catch(e => {
+          console.log(e);
+          appendFile(rejected, line, () => {
+            read++
+          })
+        })
+    } catch (e) {
+      appendFile(rejected, line + '\n\n\n', () => {})
+    }
+  }
+
+  const linereader = createInterface(createReadStream(input))
+  linereader.on('line', process)
+  linereader.on('close', () => console.log('done'))
+}
+
 async function importAllCplusToAMT() {
@@ -131,4 +168,6 @@ async function importAllCplusToAMT() {
 // TODO use command line args to choose what to do
 // doImport()
 // doAllCplusCidsToAMT()
-importAllCplusToAMT()
+// importAllCplusToAMT()
+// getAllCplusIr()
+putIrToIPFS()
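Note on the TODO above ("use command line args to choose what to do"): in this commit the pipeline step is still selected by commenting calls in and out at the bottom of the file. Step 1, getAllCplusIr(), dumps raw profiles to ./input/cplusimport.jsonl; step 2, putIrToIPFS(), turns each line into an index-request CID and appends key/CID pairs to ./input/cplusIR.txt, with rejected lines going to ./input/cplusHS.txt. Below is a minimal sketch of what the command-line dispatch could look like, assuming the three functions above are in scope; the step names and usage string are illustrative and not part of the repository:

// Hypothetical dispatcher for the TODO above (not in the repository):
// pick the pipeline step from argv instead of editing the file.
const steps: Record<string, () => Promise<void>> = {
  fetch: getAllCplusIr, // step 1: dump raw profiles to ./input/cplusimport.jsonl
  index: putIrToIPFS, // step 2: add index requests to the IPFS node, write keys/CIDs to ./input/cplusIR.txt
  amt: importAllCplusToAMT // step 3: merge the index requests into the AMT
}

const step = process.argv[2]
const run = step ? steps[step] : undefined
if (run) {
  run().catch((e) => console.log(e))
} else {
  console.log(`usage: <script> <${Object.keys(steps).join('|')}>`)
}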