Implemented some enhancement requests and fixed some bugs

Fixed #2 : Variables are not recognized when inside a rule token
Fixed #15 : Rule: rete attributes are lost when a new ontology is created
Fixed #14 : ReteNetwork: Format rules must not be added to Rete network
Fixed #16 : DefConcept: Variables are not recognized when they are keyword arguments
Fixed #4 : Comparisons are not correctly set when the comparison property is a concept
Fixed #14 : Parser: merge FunctionParser.NamesNode and ExpressionParser.NamesNode
Fixed #18 : Parser: Add SourceCodeNode test to UnrecognizedNodeParser
Fixed #20 : At startup the Number concept is saved in the DB numerous times
Fixed #21 : CacheManager: I can remove all elements from a ListIfNeededCache and fill it again
Fixed #22 : CacheManager: I can remove all elements from a SetCache and fill it again
Fixed #23 : HistoryManager: history() no longer works
Fixed #24 : HistoryManager: history() no longer works after creating an exec rule
Fixed #25 : SheerkaMemory: Use MemoryObject instead of sheerka.local
Fixed #26 : Debugger: add a list of all available services
Fixed #27 : CONCEPTS_GRAMMARS_ENTRY does not seem to be in use any more
Fixed #28 : Give order to services
This commit is contained in:
2021-02-12 15:15:31 +01:00
parent 3a12ea58df
commit cac2dad17f
62 changed files with 1182 additions and 480 deletions
@@ -5,7 +5,6 @@ from typing import List
 from core import builtin_helpers
 from core.builtin_concepts import BuiltinConcepts
-from core.builtin_helpers import parse_function
 from core.concept import Concept, DEFINITION_TYPE_BNF
 from core.global_symbols import CONCEPT_COMPARISON_CONTEXT
 from core.sheerka.services.SheerkaComparisonManager import SheerkaComparisonManager
@@ -833,10 +832,10 @@ class InFixToPostFix:
         if self.unrecognized_tokens.parenthesis_count == 0:
             self.unrecognized_tokens.fix_source()
-            res = parse_function(self.context,
-                                 self.unrecognized_tokens.source,
-                                 self.unrecognized_tokens.tokens[:],
-                                 self.unrecognized_tokens.start)
+            res = self.context.sheerka.parse_function(self.context,
+                                                      self.unrecognized_tokens.source,
+                                                      self.unrecognized_tokens.tokens[:],
+                                                      self.unrecognized_tokens.start)
         instances = get_n_clones(self, len(res))
         self.forked.extend(instances[1:])
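The hunk above replaces a module-level helper (`parse_function` imported from `core.builtin_helpers`) with a call routed through the `sheerka` service hung off the parse context. A minimal sketch of that delegation pattern, under the assumption that `context.sheerka` is a service container (the `Sheerka` and `Context` classes below are hypothetical, not taken from the repository):

```python
class Sheerka:
    """Hypothetical service container; in the real code it is reached
    via context.sheerka."""

    def parse_function(self, context, source, tokens, start):
        # Delegate parsing to whatever implementation this service is
        # configured with. Here we simply echo the tokens from `start`
        # onward to illustrate the call shape, not the real behavior.
        return list(tokens[start:])


class Context:
    """Hypothetical parse context exposing the service."""

    def __init__(self):
        self.sheerka = Sheerka()


# Before the change, callers used a free function:
#     from core.builtin_helpers import parse_function
#     res = parse_function(context, source, tokens[:], start)
# After the change, the same call goes through the context's service,
# so the parser implementation can be swapped per context:
context = Context()
res = context.sheerka.parse_function(context, "a + b", ["a", "+", "b"], 0)
```

Routing through the context keeps the parser pluggable and avoids a hard import dependency on `core.builtin_helpers`, which matches the removal of that import in the first hunk.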