Hey, today's agenda:

Yayyy, today's AI project session starts.

Hmm, today's tasks: checking dependency-parser neural nets (transformers) to bind them to the Sage-wise definitions, and then using WordNet to start converting the nouns/verbs to group-theoretic representations, including the ontology data as well.
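Here is a minimal sketch of the WordNet lookup step, assuming NLTK's WordNet interface; the hypernym chain is the ontology data that the algebraic translation would start from, and the actual mapping from synsets to group-theoretic objects is still to be defined.

```python
# a minimal sketch of the WordNet lookup, assuming NLTK; the exact
# mapping from synsets to group-theoretic objects is still to be defined
import nltk
from nltk.corpus import wordnet as wn

nltk.download("wordnet", quiet=True)  # one-time corpus download

def ontology_chain(word, pos=wn.VERB):
    """Hypernym chain of the first synset of `word` (its ontology path)."""
    synsets = wn.synsets(word, pos=pos)
    if not synsets:
        return []
    chain = [synsets[0]]
    while chain[-1].hypernyms():
        chain.append(chain[-1].hypernyms()[0])
    return [s.name() for s in chain]

print(ontology_chain("go"))   # hypernym path for the first verb sense of "go"
```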


Hmmm, there are lots of literals to translate to the Sage representation; so far I have only figured out how to represent the "go" literal.
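For the record, a rough sketch of one possible way to encode "go" in Sage; the free-abelian-group choice here is purely a placeholder assumption for illustration, not the actual definition worked out for the literal.

```python
# a rough sketch, run inside a Sage session; modelling "go" as the
# generator of a free abelian group (i.e. as a composable translation
# step) is an assumption for illustration only
from sage.all import AbelianGroup

G = AbelianGroup(1, names=("go",))   # one generator per primitive literal
go = G.gen(0)

print(go * go)           # composing two "go" steps: go^2
print(go * go * go**-1)  # undoing one step brings it back to: go
```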


Of course, synonyms won't be retranslated unless they are semantically divergent.
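A minimal sketch of how that could be enforced, assuming translations are cached per WordNet synset (sense identity rather than surface form); `build_representation` is a hypothetical stand-in for the actual translation function.

```python
# a minimal sketch: synonyms that resolve to the same synset reuse one
# representation; only semantically divergent senses get retranslated
from nltk.corpus import wordnet as wn

translation_cache = {}

def translate(word, pos, build_representation):
    synsets = wn.synsets(word, pos=pos)
    if not synsets:
        return None
    key = synsets[0].name()            # sense identity, not surface form
    if key not in translation_cache:
        translation_cache[key] = build_representation(key)
    return translation_cache[key]
```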


Hmm, I also need to install the Neo4j knowledge DB to store the translated sentences as Neo4j knowledge graphs. So something like: first the neural net analyzes a sentence of a paragraph along with the other sentences of that paragraph, then creates some dependency graph, then the conversion to the algebraic representation happens, and the result is stored in the Neo4j knowledge DB in graph format, yepp.
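A minimal sketch of the parse-and-store part of that pipeline, assuming spaCy for the dependency parse and the official neo4j Python driver; the Bolt URI, credentials, and node/edge labels are placeholders, and the algebraic-conversion step is left out.

```python
# sketch: sentence -> dependency arcs -> Neo4j graph (placeholder
# connection details; the algebraic conversion step is omitted here)
import spacy
from neo4j import GraphDatabase

nlp = spacy.load("en_core_web_sm")
driver = GraphDatabase.driver("bolt://localhost:7687",
                              auth=("neo4j", "password"))

def store_sentence(sentence):
    doc = nlp(sentence)
    with driver.session() as session:
        for token in doc:
            # one node per token, one edge per dependency arc
            session.run(
                "MERGE (h:Token {text: $head}) "
                "MERGE (t:Token {text: $tok}) "
                "MERGE (h)-[:DEP {label: $dep}]->(t)",
                head=token.head.text, tok=token.text, dep=token.dep_,
            )

store_sentence("The parser converts sentences to graphs.")
```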


So today I would also, in tandem, check transformer neural nets, plus how to use the dockerized Neo4j DB, and also in tandem continue the Ptb and noun/verb translations (representations of Ptbs and nouns/verbs with group-theoretic representations). I also need to make a Docker node for a Prolog computer and store a knowledge DB for some of the Ptbs as Prolog rulesets for now, e.g. since I can not quickly learn expert knowledge in module- or ring-based geometry representations. Some of the Ptbs require geometric definitions, but that is postponed until I finish those group theory books; for now such knowledge/inferences would be supported by a knowledge-DB-backed, Prolog-based inference engine run on Docker or so, or even on the same node that runs the neural-net-to-knowledge-graph conversion task. Or I might create an API for the neural net, then also an API for this Prolog-based expert-system inference support system (with such knowledge defined in Prolog, to support inference of Ptb semantics), and also an API for writing the knowledge DB to Neo4j storage. Hmm, then these can also be orchestrated by another service that calls these APIs.
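A minimal sketch of that coordinating service, assuming the three services already exist as HTTP APIs; every endpoint name and port below is hypothetical, and only the call order (parse, then Prolog support, then store) comes from the plan above.

```python
# hypothetical endpoints for the three planned services; only the call
# order reflects the plan -- names, ports and payload shapes are guesses
import requests

PARSER_API = "http://localhost:8001/parse"   # neural net / dependency parser
PROLOG_API = "http://localhost:8002/infer"   # Prolog expert-system support
STORE_API  = "http://localhost:8003/store"   # writes graphs into Neo4j

def process_paragraph(paragraph):
    parsed   = requests.post(PARSER_API, json={"text": paragraph}).json()
    enriched = requests.post(PROLOG_API, json={"graph": parsed}).json()
    return requests.post(STORE_API, json={"graph": enriched}).json()
```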

Hmm, I think I would first run it all collectively in Jupyter notebooks on a single machine, then later convert those services to APIs one by one, etc.
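When that conversion happens, wrapping a notebook function as a small web service could look roughly like this, assuming FastAPI; `translate_sentence` is a hypothetical stand-in for whatever the notebook already defines.

```python
# sketch of promoting one notebook function to an API endpoint (FastAPI
# assumed); translate_sentence is a placeholder for the notebook's logic
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Sentence(BaseModel):
    text: str

def translate_sentence(text: str) -> dict:
    # placeholder for the notebook's parse-and-translate code
    return {"input": text, "representation": None}

@app.post("/parse")
def parse(sentence: Sentence):
    return translate_sentence(sentence.text)
```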

Yeyy, then some weeks later I can talk with the AI and it could reason with the algebraic inference mechanisms, the onset of the 0.1 version. Hmm, let's make the 0.1 version deadline 2 weeks from now :)


In 2 weeks the AI should be able to talk with me and understand. I think it's kind of a very strict milestone to set at 2 weeks, since WordNet would need a lot of study, but I think the AI's first version, which can talk/understand, can be ready in 2 weeks.

Hmm, meanwhile I need to intermittently continue studying group-theoretic topics to become an expert, since that is required to study that mathematical-structure-of-quantum-mechanics book completely... and also have the AI study it while I study, so that I could then instruct the AI to write the particle physics simulator, yepp. So the first task for the coder AI is: writing the quantum mechanics / particle physics simulator, both learning the quantum mechanics from the book's paragraphs and with my assistance in defining the formulations algebraically.

Initially, the auto-generative algebra investigation features won't be present in the 0.1 AI version; it is not expected to invent yet, but rather to, e.g., understand paragraphs, learn particle physics, and write the particle physics / quantum mechanics simulator with both its understanding of the paragraphs and assistance from me. I also need to create some algebraic-knowledge-DB-to-code generation/translation mechanism to have the knowledge converted to code; that is how the particle physics simulator would be written. But overall, as is visible, the initial version won't have full inference/automation capabilities, e.g. it cannot entirely by itself learn quantum mechanics nor write the simulator. These features would be added in the 0.2 and 0.3 versions, and the 1.0 version would need much less assistance to write simulators etc.

Hmm, let's be agile and not plan the 0.2 version from now. Let's finish this initial 2-week plan for iterating the 0.1 version features, then later iterate each feature one by one over time, so that the 1.0 version becomes very talented in learning and inventing (e.g. inventing algebras, being able to read/understand/interpret scientific articles, devising simulations for the gathered knowledge, very quickly writing/generating code, or creating optimized designs for e.g. quantum algebras or error-correcting codes of a quantum algebra; a future 1.0 version would be tasked with finding a very practical quantum computer design and also inventing its error-correcting codes / algebra logic). Let's not plan those 1.0 version iterations from now. Let's first accomplish the functionality of the 0.1 version, and the deadline of the iteration is: 2 weeks later, yepp.



