Please use this identifier to cite or link to this item: https://gnanaganga.inflibnet.ac.in:8443/jspui/handle/123456789/2592
Full metadata record
DC Field | Value | Language
dc.contributor.author | Midya, Abhisek | -
dc.contributor.author | Thomas, D G | -
dc.contributor.author | Pani, Alok Kumar | -
dc.contributor.author | Malik, Saleem | -
dc.contributor.author | Bhatnagar, Shaleen | -
dc.date.accessioned | 2023-12-19T05:08:57Z | -
dc.date.available | 2023-12-19T05:08:57Z | -
dc.date.issued | 2017 | -
dc.identifier.citation | Vol. 10256 LNCS; pp. 156-169 | en_US
dc.identifier.isbn | 9783319591070 | -
dc.identifier.isbn | 9783319591087 | -
dc.identifier.issn | 0302-9743 | -
dc.identifier.issn | 1611-3349 | -
dc.identifier.uri | https://doi.org/10.1007/978-3-319-59108-7_13 | -
dc.identifier.uri | http://gnanaganga.inflibnet.ac.in:8080/jspui/handle/123456789/2592 | -
dc.description.abstract | In [2,16], a new method for describing pictures of digitized rectangular arrays, based on contextual grammars and called parallel internal contextual array grammars, was introduced. In this paper, we focus on parallel internal column contextual array grammars and observe that the languages generated by these grammars are not inferable from positive data alone. We define two subclasses of parallel internal column contextual array languages, namely k-uniform and strictly parallel internal column contextual languages, which are incomparable and not disjoint classes, and provide identification algorithms to learn these classes. © Springer International Publishing AG 2017. | en_US
dc.language.iso | en | en_US
dc.publisher | Combinatorial Image Analysis: 18th International Workshop, IWCIA 2017 | en_US
dc.subject | Identification in the limit from positive data | en_US
dc.subject | K-uniform | en_US
dc.subject | Parallel internal column contextual array grammars | en_US
dc.title | Polynomial Time Algorithm For Inferring Subclasses of Parallel Internal Column Contextual Array Languages | en_US
dc.type | Article | en_US
Appears in Collections: Conference Papers

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.