Yay, time to download as many neural-net implementations of dependency graph parsers (for NLP) as possible, to check the available open-source versions (transformer-based ones and even non-neural implementations), and to start the main Jupyter notebook. I'll also install Neo4j on the same machine where the notebook runs.

At the same time, let's start writing the PTB-to-Sage code translation part. Initially in Python, since I'll first assemble the logic in a Jupyter notebook; later, when it's turned into API calls, those would be converted, if necessary, to some other language, most probably Scala, or maybe Go, I don't know. But initially it will all be one notebook that runs, so its language is Python. Big data tasks, e.g. ontology DB generation, are of course Scala code written in Zeppelin with Scala, but the main logic will be in Python. I mean, version 0.1 would be a set of Jupyter notebooks, and even Neo4j and Prolog would be installed on the same machine as the notebook instead of any service/API setup.

Hmm, but Sage might be dockerized and made an API from the start, since asking Sage something by running a Docker container every time seems very clumsy. Sage would be made an API from the beginning, because it's easy to install Neo4j but not at all easy to install Sage on just any PC. I would rather use Sage's ready-made Docker image (last time I tried to install Sage locally I lost a lot of time on errors about missing libraries etc.), so I would definitely run Sage commands from Docker, deploying it either on EKS or on a light PC constantly running the container plus some microservice for a query API.
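Since the PTB-to-Sage translation starts from Penn Treebank bracketed trees, a first step is reading a bracketed string into a nested structure. A minimal pure-Python sketch of that step might look like this (the function names are my own, not from any parser library):

```python
def tokenize_ptb(s):
    """Split a PTB bracketed string into '(', ')' and word/label tokens."""
    return s.replace("(", " ( ").replace(")", " ) ").split()

def parse_ptb(tokens):
    """Recursively build a nested (label, children...) tuple from the token list."""
    tok = tokens.pop(0)
    if tok == "(":
        label = tokens.pop(0)
        children = []
        while tokens[0] != ")":
            children.append(parse_ptb(tokens))
        tokens.pop(0)  # drop the closing ')'
        return (label, *children)
    return tok  # a leaf word

tree = parse_ptb(tokenize_ptb("(S (NP (DT the) (NN cat)) (VP (VBZ sleeps)))"))
# tree == ("S", ("NP", ("DT", "the"), ("NN", "cat")), ("VP", ("VBZ", "sleeps")))
```

From a nested tuple like this, the notebook logic could then walk the tree and emit the corresponding Sage expressions; that emit step is the actual translation work.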
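For the dockerized Sage, the notebook can shell out to the official `sagemath/sagemath` image and evaluate one expression per container via `sage -c`. A sketch of that wrapper, assuming Docker is installed and the image is pulled (the helper names are my own):

```python
import subprocess

SAGE_IMAGE = "sagemath/sagemath"  # the official Sage Docker image

def sage_command(expr):
    """Build the docker invocation that evaluates one Sage expression."""
    return ["docker", "run", "--rm", SAGE_IMAGE, "sage", "-c", f"print({expr})"]

def ask_sage(expr):
    """Run the expression in a throwaway container and return its stdout.

    Spinning up a container per query is exactly the per-call overhead
    that a long-running query-API microservice would avoid.
    """
    out = subprocess.run(sage_command(expr), capture_output=True, text=True, check=True)
    return out.stdout.strip()

# e.g. ask_sage("factor(2^64 - 1)")  # requires docker and the image locally
```

The API version would be the same `sage -c` call behind a small HTTP endpoint in a constantly running container, so each query pays only the subprocess cost, not a container start.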