- yayyy, this weekend's goal is:


hmm, I was pretty lazy last week, but I also learnt about dependency constructs during some evenings. Then last night, after I stopped resting, I arranged initial possible Sage-based representations for many common dependency constructs.

hmm, today starts with continuing to study the neo4j db, since it might be the main storage mechanism for the paragraphs that get read, which would hold the Sage definition sets like the idea I had initially. So I now resume my neo4j study.
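
to make that concrete for myself, here's a rough sketch of what storing a read paragraph could look like with the official neo4j Python driver. The bolt URI, credentials, and the Paragraph label / text property are just placeholders I made up here, not a settled schema.

```python
# Minimal sketch: store one read paragraph as a node in neo4j.
# URI, credentials, "Paragraph" label and "text" property are placeholders.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def store_paragraph(tx, text):
    # MERGE avoids creating duplicate nodes for the same paragraph text
    tx.run("MERGE (p:Paragraph {text: $text})", text=text)

with driver.session() as session:
    session.execute_write(store_paragraph, "Electrons are elementary particles.")

driver.close()
```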

hmm, then I think I would utilize a neural-net-based dependency definitions provider and a spaCy-based dependency definitions provider, and then translate those into Sage-wise definitions.
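
the spaCy side would look roughly like this sketch: pulling (head, dependency label, child) triples out of a sentence, which would later get mapped to Sage-wise definitions. en_core_web_sm is just the small default English model, nothing decided yet.

```python
# Sketch: extract dependency triples from a sentence with spaCy.
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("The electron carries a negative charge.")

for token in doc:
    # each token points to its syntactic head via a labelled dependency
    print(token.head.text, token.dep_, token.text)
# prints triples roughly like: "carries nsubj electron", "carries dobj charge", ...
```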

hmm, maybe I'll have a basic talking AI version this weekend (hmm, it would interpret and store knowledge from the paragraphs it reads), but not a very intelligent one yet.

though the talking part would need training BERT-like transformers, this time going from Sage-wise definition sets to language-based representations, so maybe that won't be possible to implement by Sunday.

Today's goal is to first learn neo4j and then start translating the dependency graphs of read text into Sage-wise definition sets stored as neo4j definitions.
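
something like the sketch below is what I have in mind for that translation step: walking a spaCy dependency parse and MERGE-ing each token and each dependency edge into neo4j. The Token label, the DEP relationship type and the property names are placeholder schema I just invented; the actual mapping to Sage-wise definition sets would sit on top of this later.

```python
# Rough sketch: persist a spaCy dependency parse as a graph in neo4j.
# "Token" label, "DEP" relationship and property names are placeholders.
import spacy
from neo4j import GraphDatabase

nlp = spacy.load("en_core_web_sm")
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def store_dependency_graph(tx, sent_id, doc):
    for token in doc:
        if token.dep_ == "ROOT":
            continue  # the root token has no separate head edge
        tx.run(
            """
            MERGE (h:Token {sent: $sent, i: $hi, text: $htext})
            MERGE (c:Token {sent: $sent, i: $ci, text: $ctext})
            MERGE (h)-[:DEP {label: $dep}]->(c)
            """,
            sent=sent_id,
            hi=token.head.i, htext=token.head.text,
            ci=token.i, ctext=token.text,
            dep=token.dep_,
        )

doc = nlp("The electron carries a negative charge.")
with driver.session() as session:
    session.execute_write(store_dependency_graph, "sent-1", doc)
driver.close()
```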


yayy, then next week I could continue group-theoretic studies to implement other logic that would enable the AI to understand even particle physics. yepp, if I can finish some structure/solution designs this week and during the following weekdays, then next week I might refocus on continuing my group-theoretic study to build those knowledge/skill structures, and then it might even be at least as intelligent as me.

yayy, then another later task would be teaching it how to learn coding, and then how to translate conceptual definition sets into code and write simulators.

yayy, then comes the task of writing a particle physics simulator, then searching for the design of a quantum mechanics framework and its various algebra modules. And then comes the search for possibly the most practical quantum computer design specs and its algebra.

yepp, the goals for the next 2 months would be roughly these.

this month's goal is clearly to build the AI in the following 2 weeks, then gradually increase its thought skills, e.g. enabling group-theoretic skills, the capability to figure out algebra designs/solutions itself, etc.

so I think in 1.5 months it would be able to read particle-physics-like books and understand them. I think this is an ambitious project plan. It might come later, maybe in 3 months, I don't know. But if I finish the project by November 1, then I definitely won't miss the milestone of the very initial project plan.


This iteration's mood is a mix of laziness and studying. I mean, it's not like the previous iteration many months ago, which was severe studying. I figured out I can also arrange project sprints that include lazy resting; I spare time for laziness too. I think the whole project would be ready by November 1 at the latest, yepp, even with lots of laziness time.

It was hard to sustain constant-studying-type project sprints imho. Not impossible to keep such a sprint schedule, but harder imho. Current iterations/sprints now include roughly 50% laziness time beside 50% studying time. Maybe, if I could, I might increase the studying time slightly, I don't know. I would iterate every week with a fresh time allocation for that week; it might change per week.


I am very curious about how geometry is implemented/represented in group-theoretic constructs, but that's a study topic for at least 3 weeks or a month from now. That would also become the base of how the AI interprets geometry.

now some uninteresting neo4j time. It's kind of uninteresting but a useful/necessary framework to learn. Not that interesting to learn neo4j itself (since it's the usual software architecture with its own methodology/pattern sets), but it's the necessary layer to bind and store the knowledge of Sage-based definition sets.

Very uninterestedly, I now start trying to learn neo4j. (the usual software patterns, nothing much new. ayy, boring to learn).




 


