Enhancing Data Integrity through Provenance Tracking in Semantic Web Frameworks

Nilesh Jain
technoNilesh@gmail.com (University of Mumbai)

Abstract: This paper explores the integration of provenance tracking systems within the context of Semantic Web technologies to enhance data integrity in diverse operational environments. SURROUND Australia Pty Ltd demonstrates innovative applications of the PROV Data Model (PROV-DM) and its Semantic Web variant, PROV-O, to systematically record and manage provenance information across multiple data processing domains. By employing RDF and Knowledge Graphs, SURROUND addresses the critical challenges of shared entity identification and provenance granularity. The paper highlights the company's architecture for capturing comprehensive provenance data, enabling robust validation, traceability, and knowledge inference. Through the examination of two projects, we illustrate how provenance mechanisms not only improve data reliability but also facilitate seamless integration across heterogeneous systems. Our findings underscore the importance of sophisticated provenance solutions in maintaining data integrity, serving as a reference for industry peers and academics engaged in provenance research and implementation.
I. INTRODUCTION

SURROUND Australia Pty Ltd ("SURROUND") is a small but distinctive technology company that specialises in providing state-of-the-art AI and data management products to both government and private sector markets. Founded with the mission of changing how organisations manage, process, and leverage data, SURROUND has quickly established itself as a leader in the field by offering unique and advanced solutions. At the core of SURROUND's offerings lies its sophisticated use of Semantic Web data, an innovative approach that distinguishes the company from its competitors. SURROUND firmly believes that the Semantic Web is the best means of preserving meaning over time, enabling system and organisational change without the loss of critical context. This conviction is grounded in the strong capabilities of the Semantic Web, which allow for a greater degree of flexibility, adaptability, and scalability when compared with traditional data management methods.

The value propositions for SURROUND's clients are clear and compelling, particularly in the way they leverage the Semantic Web to address a wide range of complex data challenges. These value propositions include:

• Expressivity and Complexity: The expressivity of RDFS (https://www.w3.org/TR/rdf-schema/) and OWL 2 (https://www.w3.org/TR/owl2-overview/) enables the creation of arbitrarily complex yet robust data models.
These frameworks provide a foundation for representing sophisticated data structures that can grow and adapt to the changing needs of an organisation.

• Reuse of Existing Models: The Semantic Web allows for the direct reuse of many existing, highly complex, and published models (ontologies). This significantly reduces the effort required to build new data models from scratch, while also ensuring that industry best practices and established knowledge are incorporated into the system design.

• Extensibility: The RDF graph-based data structures used by SURROUND are inherently extensible. As systems grow and evolve, there is no need to change the underlying schema, making it easier to accommodate new requirements without disrupting the existing infrastructure.

• System Independence: Semantic Web data structures are system independent, enabling seamless under-the-hood system changes without affecting the integrity of the data or the applications that depend on it. This makes SURROUND's solutions particularly attractive to organisations that expect changes in their IT infrastructure over time.

• Bridging Siloed Applications: By implementing a Semantic Web layer, SURROUND enables different internal applications, which are often siloed and isolated from each other, to share data seamlessly. This creates a more unified and collaborative environment within organisations, where systems that were previously incompatible can now communicate with ease.

• Cross-Organisational Information Sharing: With the use of Semantic Web technologies, SURROUND provides the ability to share data across organisational boundaries without the need for special inter-organisational data contracts. This is made possible by the semantic modelling of all data elements, which ensures that data can be understood and used by external parties without requiring custom integrations.

• Data Validation: Modern constraint languages, such as SHACL [1], offer strong data validation capabilities.
These languages enable organisations to enforce strict data quality standards, ensuring that the data used across different systems is accurate, consistent, and compliant with relevant rules (a minimal SHACL validation sketch follows this list).

• Advanced Reasoning Capabilities: The advanced reasoning capabilities of OWL and SHACL allow SURROUND's systems to infer new knowledge from existing information. This is particularly useful for applications that require decision-making based on complex, interconnected data, as it enables the system to derive new insights and make predictions that would not have been apparent through conventional methods.
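To make the Data Validation point concrete, the following is a minimal, self-contained sketch of SHACL validation in Python using rdflib and pySHACL. It is our own illustration rather than anything from SURROUND's systems; the ex: namespace, the shape, and the sample data are assumptions.

    # Minimal SHACL validation sketch using rdflib and pySHACL.
    # The ex: namespace, the shape, and the sample data are illustrative only.
    from rdflib import Graph
    from pyshacl import validate

    shapes_ttl = """
    @prefix sh:  <http://www.w3.org/ns/shacl#> .
    @prefix ex:  <http://example.org/> .
    @prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

    ex:DocumentShape a sh:NodeShape ;
        sh:targetClass ex:Document ;
        sh:property [
            sh:path ex:title ;
            sh:datatype xsd:string ;
            sh:minCount 1            # every document must carry a title
        ] .
    """

    data_ttl = """
    @prefix ex: <http://example.org/> .
    ex:doc1 a ex:Document .          # deliberately missing ex:title
    """

    shapes_graph = Graph().parse(data=shapes_ttl, format="turtle")
    data_graph = Graph().parse(data=data_ttl, format="turtle")

    conforms, report_graph, report_text = validate(data_graph, shacl_graph=shapes_graph)
    print(conforms)      # False: the missing title is reported as a violation
    print(report_text)

Running the sketch prints a human-readable validation report; in practice the report graph itself can be retained as part of a data quality audit trail.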
A key emerging benefit of these capabilities is the ability to provide comprehensive provenance information across SURROUND's various systems. Provenance refers to the history or lineage of data, including its origins, transformations, and how it has been used over time. This is a fundamental part of data management, particularly in situations where data accuracy, traceability, and accountability are paramount. With provenance information embedded in every part of the system, organisations can trace and understand the evolution of their data, ensuring that it can be trusted and that decisions made on the basis of it are well informed.

In this paper, we do not present new research claims, as this is an Applications Track paper. Instead, we focus on the innovative use of provenance in operational systems and the deployment of provenance-based solutions that demonstrate a mature approach to the use of provenance for real-world tasks. Our work aims to give industry peers insight into how provenance is being applied within the context of Semantic Web technologies and data management systems. Furthermore, this paper seeks to inform academics and researchers who are interested in understanding the current state of provenance research as it relates to practical implementation. By showcasing this real-world application of provenance, we hope to provide valuable input to the ongoing assessment of provenance research's impact and its future directions.

The structure of the paper is as follows: we first give an overview of SURROUND's comprehensive provenance systems and their integration into our products and services. We then discuss two specific projects that have used these provenance systems, showing how they have been applied in practice. In doing so, we highlight the reasons behind our choice of particular PROV-related implementations and describe how these choices align with our business goals and technical requirements. Finally, we explore areas where we believe provenance standards could be improved to better meet the needs of organisations like SURROUND and its clients. Through this discussion, we aim to contribute to the ongoing dialogue on provenance research and its practical applications in the field of data management and AI.

II. SIMPLE PROVENANCE THEORY, COMPLEX PRACTICE

The extraction of useful and actionable information from heterogeneous or large-scale data settings is a fundamental challenge in modern data processing. This extraction can be performed in different ways depending on the nature of the data and the context in which it is being used. In situations where some of the data has a known structure, conventional queries can be used to select relevant subsets of the data. The simplest form of this is text-based searching, which involves searching against textual content with varying levels of sophistication, depending on the nature of the search query and the underlying data. More advanced techniques may involve the use of statistical methods to identify patterns within the data, enabling the extraction of useful information from apparently unstructured or semi-structured datasets.

SURROUND has adopted machine learning (ML) approaches to facilitate the discovery and correlation of information from large, complex datasets. By training systems to recognise and infer patterns, SURROUND's ML systems can uncover hidden relationships and insights within data that would otherwise be difficult to identify using conventional techniques. The use of ML enhances the capacity to process data and improves the overall performance of the system. Alongside these ML methods, SURROUND uses Semantic or Knowledge Graph (KG)-based contextual information to provide additional layers of meaning and relevance, improving the accuracy and depth of information retrieval. These knowledge graphs are particularly valuable for understanding the relationships between different entities and can be used to guide the interpretation of data, ensuring that context is preserved even as data evolves over time.

In some cases, the inference of structure from unstructured or semi-structured data is also facilitated through ML techniques. These approaches enable SURROUND to create meaningful, structured representations from raw or noisy data, making it easier to apply reasoning and perform automated analysis. To further refine the quality of the data and the models used, SURROUND incorporates Human-in-the-Loop (HITL) methods into its operations. These HITL activities involve human oversight to review, refine, and improve the training of the ML systems, ensuring that they can be continuously updated and adapted to reflect changing requirements and improve accuracy.
HITL methods are particularly effective in situations where automated systems are not sufficient on their own, and where human expertise is required to guide the learning process.

The implementation of these systems requires complex, hybrid models that combine reasoning, Semantic Web technologies, and ML to organise and retrieve information efficiently from large-scale projects. These systems must be able to handle data across various formats and processes while ensuring that the provenance of all data is recorded in a systematic and consistent manner. Provenance, in this context, refers to the history or lineage of data, including where it came from, how it was processed, and how it has been used over time. Provenance is a critical part of data management, particularly in applications that require high degrees of trust, accountability, and traceability.

To record provenance systematically across numerous systems and different data processing domains, it is essential to have a well-defined and coherent provenance reference model, as well as a robust technical foundation for translating, transporting, and integrating data across systems. The introduction and widespread adoption of the PROV Data Model (PROV-DM) [2] and its Semantic Web counterpart, PROV-O [3], has provided SURROUND with a flexible and comprehensive provenance framework that can be applied across many scenarios. The PROV model has proven to be sufficiently flexible for our needs, with only minor extensions required to fit it to the particular requirements of our various systems. The model is also robust enough to support systematic use across our projects, ensuring consistency and interoperability.
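For readers less familiar with PROV-O, the following minimal rdflib sketch shows the kind of Entity/Activity/Agent record the model supports. It is an illustration only; all URIs under http://example.org/ are placeholder assumptions, not identifiers from SURROUND's systems.

    # A minimal PROV-O provenance record built with rdflib.
    # All URIs under http://example.org/ are illustrative placeholders.
    from rdflib import Graph, Namespace, Literal
    from rdflib.namespace import RDF, XSD

    PROV = Namespace("http://www.w3.org/ns/prov#")
    EX = Namespace("http://example.org/")

    g = Graph()
    g.bind("prov", PROV)
    g.bind("ex", EX)

    doc = EX["document/123"]           # prov:Entity - an input document
    result = EX["classification/123"]  # prov:Entity - the derived classification
    run = EX["activity/ner-run-1"]     # prov:Activity - one processing run
    agent = EX["agent/workflow-tool"]  # prov:SoftwareAgent - the software that ran it

    g.add((doc, RDF.type, PROV.Entity))
    g.add((result, RDF.type, PROV.Entity))
    g.add((run, RDF.type, PROV.Activity))
    g.add((agent, RDF.type, PROV.SoftwareAgent))

    g.add((run, PROV.used, doc))               # the run consumed the document
    g.add((result, PROV.wasGeneratedBy, run))  # and generated the classification
    g.add((result, PROV.wasDerivedFrom, doc))
    g.add((run, PROV.wasAssociatedWith, agent))
    g.add((run, PROV.startedAtTime,
           Literal("2025-01-12T10:00:00Z", datatype=XSD.dateTime)))

    print(g.serialize(format="turtle"))

Because the record is plain RDF, it can be merged with project data held in the same triplestore and queried with SPARQL alongside it.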
However, the technical implementation of provenance tracking and integration is not without its challenges. Two primary challenges that we face in our work are:

1) Shared Entity Identification: A fundamental challenge in multi-system data processing is ensuring that entities, such as people, documents, or other objects, are correctly identified across different systems. As data moves between systems and is processed in different ways, it is essential to maintain consistent identification of these entities in order to preserve the integrity of the provenance information. This shared entity identification ensures that the provenance records are accurate and reflect the correct relationships between the entities and their transformations.

2) Granularity: Provenance information must be recorded at an appropriate level of detail to capture the necessary insights without becoming overly complicated or difficult to manage. The granularity of provenance refers to the level of detail included in provenance records, and it is essential to strike a balance between capturing sufficient detail to support trust and accountability, while avoiding excessive complexity that could overwhelm users or slow down the system. Furthermore, there must be mechanisms for aggregating provenance information at higher levels for system- or process-level summaries.

The first challenge, shared entity identification, is addressed using the Resource Description Framework (RDF), which uses unique Uniform Resource Identifiers (URIs) to identify entities and other objects. RDF's Open World Assumption (OWA) allows knowledge to be represented across independent RDF datasets, enabling different systems to reference shared URIs and merge them (a small merging sketch appears after the guidelines below). SURROUND uses RDF not only for provenance tracking but also as the primary data format for most of its projects. By representing project data and its associated provenance in RDF, we can ensure that entities are identified consistently across multiple datasets, facilitating the integration of information from different sources. This approach is also extended to non-RDF data sources, for example Git-based software and data version control, where URIs are used to identify and reference entities.

To ensure that provenance information is consistently integrated across different subsystems, we have developed a set of guidelines and practices that enable the coherent identification and tracking of objects. These guidelines include:
1) Object identities must be established in Knowledge Graphs (KGs) and accessed through APIs.
2) Object identity must be managed within the Knowledge Graphs whenever HITL interactions are required.
3) Processing subsystems must preserve and report standard object identities to ensure consistency across the various stages of the data lifecycle.
4) Convenient sets of objects may be managed in specific version control systems, such as Git, as long as the datasets containing them are described within the Knowledge Graphs (see dataset granularity).
5) All processing reports and results must include provenance information, using the canonical PROV-DM model.
6) Processing components must be logically identified within Knowledge Graphs to ensure that all parts of the system are accurately tracked.

By following these guidelines, we have been able to develop a robust framework for integrating provenance information across different subsystems and ensuring that it is consistently tracked and reported throughout the data lifecycle. These systems and techniques are described in more detail in the following section.
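As a small, concrete illustration of shared entity identification, the sketch below merges RDF fragments produced by two hypothetical subsystems that both reference the same entity URI. The namespaces, properties, and data are assumptions for illustration only.

    # Sketch: two independent systems describe the same entity URI, and their
    # RDF graphs merge without any mapping step (Open World Assumption).
    # The example.org URIs and properties are illustrative placeholders.
    from rdflib import Graph

    system_a = """
    @prefix ex: <http://example.org/> .
    ex:person-42 ex:name "Alice Example" .
    """

    system_b = """
    @prefix ex:   <http://example.org/> .
    @prefix prov: <http://www.w3.org/ns/prov#> .
    ex:report-7 prov:wasAttributedTo ex:person-42 .
    """

    merged = Graph()
    merged += Graph().parse(data=system_a, format="turtle")
    merged += Graph().parse(data=system_b, format="turtle")

    # Both descriptions now attach to the single shared URI ex:person-42.
    for s, p, o in merged:
        print(s, p, o)

No alignment or reconciliation step is needed because both systems agreed on the URI up front, which is exactly what the guidelines above are designed to guarantee.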
To illustrate the effectiveness of our provenance tracking approach, we provide Figure 1, which shows the user interface of the SURROUND Ontology Platform (SOP). This interface displays provenance information within a Sankey diagram, allowing users to visualise the flow of data and its associated provenance across different processing steps. The provenance data shown in the diagram is generated according to the PROV-DM/PROV-O model by SURROUND's processing workflow tool, ProvWF, which performs Named Entity Recognition (NER) against electronic documents and matches entities against some of SURROUND's Knowledge Graph objects. As well as visualising provenance information, SOP also manages it in bundles as Managed Graphs, which are treated as semantic assets. These Managed Graphs are automatically associated with provenance information, including ownership and access control, ensuring that data is properly managed and tracked throughout its lifecycle.

The issue of provenance granularity is addressed by analysing the different kinds of processing that typically occur in heterogeneous systems. SURROUND performs a case-by-case assessment of the required provenance granularity for each project, ensuring that the level of detail captured is appropriate for the specific needs of the task. Table I gives a summary of processing functions, examples of these functions, and the required granularity for each case. By performing these assessments and using existing tools to generate and store provenance, SURROUND can ensure that the right level of provenance information is captured and maintained for its projects.

III. COMPANY-WIDE PROVENANCE ARCHITECTURE

To efficiently manage and track the provenance of the various assets within our IT projects, SURROUND has implemented a comprehensive, company-wide architecture for recording and using provenance information. This framework ensures that all data and processes are traceable, verifiable, and can be reliably linked to their origins, transformations, and results. Provenance tracking is crucial not only for maintaining transparency but also for improving the overall efficiency and integrity of our data processing pipelines.

The provenance architecture we use is built on a combination of dedicated tools and general-purpose systems, designed to capture and store provenance data in a structured and standardised manner. The core of this architecture is the use of PROV-O, a widely adopted ontology for modelling provenance in the Semantic Web. Below, we describe the major components of this architecture and the roles they play in supporting our various data management needs.
A. Provenance Tools

Our framework is primarily built on the following major tools, each serving a particular purpose in the provenance tracking process:

• SURROUND Ontology Platform (SOP): The SURROUND Ontology Platform (SOP) is a key enterprise-level data management system built on semantic technologies. SOP is based on TopQuadrant's EDG (Enterprise Data Governance) framework, which provides a robust foundation for managing data assets and governance policies. SOP extends this framework by incorporating the management of semantic asset states and collections, enabling a more flexible and comprehensive data management environment. SOP plays a crucial role in recording PROV-DM-compliant provenance for all activities involving semantic assets. This includes recording actions performed on semantic data assets, such as changes, updates, and transformations, as well as the relationships between these assets. The provenance information stored in SOP is critical for understanding the flow of data across systems and for ensuring that all data changes are transparent and traceable. Furthermore, SOP's integration with other tools in the ecosystem allows seamless transfer and visualisation of provenance data, contributing to a unified approach to data governance across the organisation. SOP records and integrates provenance information generated by different systems and workflows, providing a holistic view of the data lifecycle. For more information about SOP, visit https://surroundaustralia.com/sop.

• ProvWorkflow (ProvWF): ProvWorkflow (ProvWF) is a Python-based framework designed to facilitate the creation of workflows for various data processing tasks. These workflows, once executed, record PROV-DM provenance data for the actions performed during the workflow execution, as well as the data that is consumed and produced. ProvWF is supported by SURROUND's Block Library, which provides reusable function blocks that can be integrated into workflows. These blocks cover a wide range of tasks, for example Knowledge Graph (KG) API requests, Natural Language Processing (NLP) for text analysis, and other data processing activities. ProvWF's modular design enables the easy composition of complex workflows from simple, reusable building blocks. The provenance data generated by ProvWF is transferred to SOP as provenance Bundles, ensuring that all actions within workflows are fully traceable within the broader data management framework. This integration enables end-to-end tracking of data and process changes across the various stages of the workflow.
ProvWF also provides the flexibility to track provenance at different levels of granularity, depending on the needs of the project (a sketch of this Block-as-Activity pattern follows this list). For more information about ProvWF, visit https://surroundaustralia.com/provwf.

• Block Library: The Block Library is an essential part of the ProvWF ecosystem, containing a catalogue of predefined Blocks, which are essentially PROV-DM Activity class objects. These blocks represent reusable functions or actions that can be incorporated into workflows to perform common tasks, for example API interactions, text processing, and data analysis. By maintaining a comprehensive library of blocks, SURROUND ensures that workflows can be built more efficiently, with common operations abstracted away into reusable components. The Block Library simplifies workflow creation, reduces redundancy, and ensures consistency in the execution of common tasks. Each block is associated with its own set of provenance data, which is tracked and integrated into the broader PROV-O framework.

• Git: Git, the distributed version control system, is used extensively within our organisation to manage the versioning of assets such as code, data, and documentation. It allows us to track the changes made to assets over time and ensures that each version is properly recorded and traceable. While Git itself does not natively support PROV-O provenance, we make use of it by recording URIs for entities managed within Git repositories and referring to them in our PROV-O data. The use of URIs ensures that we can consistently track entities across both Git repositories and other data management systems. This enables seamless integration of Git-managed assets with the broader provenance ecosystem, without the need for complex Git-to-PROV mappings such as Git2PROV [4]. By referring to entities in Git using URIs, we can preserve the integrity of our provenance tracking and maintain a consistent, unified model of data and process relationships. Git repositories can be both public and private, and the provenance information relating to each asset is stored and managed in a way that guarantees transparency and traceability. For more information about Git, visit https://git-scm.com/.
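The following sketch suggests how a workflow tool might record each Block execution as a PROV Activity attached to an enclosing workflow Activity. It is a simplified pattern written against rdflib, not ProvWF's actual implementation; the URIs, the dcterms:hasPart nesting, and the prov:wasInformedBy chaining are our own modelling assumptions.

    # Sketch: recording Block executions as PROV Activities inside a workflow
    # run. Illustrative pattern only, not ProvWF's real code; URIs and the
    # dcterms:hasPart / prov:wasInformedBy choices are assumptions.
    from datetime import datetime, timezone
    from rdflib import Graph, Namespace, Literal
    from rdflib.namespace import RDF, DCTERMS

    PROV = Namespace("http://www.w3.org/ns/prov#")
    EX = Namespace("http://example.org/")


    def run_block(g, workflow_uri, block_name, previous=None):
        """Record one Block execution as a prov:Activity within the workflow."""
        block = EX[f"workflow-run/1/block/{block_name}"]
        g.add((block, RDF.type, PROV.Activity))
        g.add((block, PROV.startedAtTime, Literal(datetime.now(timezone.utc))))
        # Simple nesting choice: the workflow run has the Block run as a part.
        g.add((workflow_uri, DCTERMS.hasPart, block))
        if previous is not None:
            # Linear ordering: each Block is informed by the one before it.
            g.add((block, PROV.wasInformedBy, previous))
        g.add((block, PROV.endedAtTime, Literal(datetime.now(timezone.utc))))
        return block


    g = Graph()
    g.bind("prov", PROV)
    g.bind("ex", EX)
    g.bind("dcterms", DCTERMS)

    workflow = EX["workflow-run/1"]
    g.add((workflow, RDF.type, PROV.Activity))

    previous = None
    for name in ["fetch-documents", "ner", "store-results"]:
        previous = run_block(g, workflow, name, previous=previous)

    print(g.serialize(format="turtle"))

The resulting graph can be serialised and handed to a provenance store as a single Bundle, matching the transfer pattern described for ProvWF and SOP above.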
Fig. 1. An example of a provenance trace from a processing workflow that uses elements of a knowledge graph, performs processing in cloud-hosted scalable services, generates augmented views of an input stream (performing Named Entity Recognition on a document set and annotating with elements from the knowledge graph), persists the results in the knowledge graph, and integrates the provenance trace with the provenance trace generated by knowledge graph management.
class="ltx_inline-block ltx_align_top" id="S0.T1.1.3.1.2.1"> <span class="ltx_p" id="S0.T1.1.3.1.2.1.1" style="width:0.0pt;"> <span class="ltx_inline-block ltx_minipage ltx_align_top" id="S0.T1.1.3.1.2.1.1.1" style="width:433.6pt;"> <span class="ltx_quote" id="S0.T1.1.3.1.2.1.1.1.1"> <span class="ltx_p" id="S0.T1.1.3.1.2.1.1.1.1.1">Establishment of defs, Registration entities, Annotation, Classification for training</span> </span> </span></span> </span> </td> <td class="ltx_td ltx_nopad_r ltx_align_justify ltx_align_top" id="S0.T1.1.3.1.3"> <span class="ltx_inline-block ltx_align_top" id="S0.T1.1.3.1.3.1"> <span class="ltx_p" id="S0.T1.1.3.1.3.1.1" style="width:0.0pt;"> <span class="ltx_inline-block ltx_minipage ltx_align_top" id="S0.T1.1.3.1.3.1.1.1" style="width:433.6pt;"> <span class="ltx_quote" id="S0.T1.1.3.1.3.1.1.1.1"> <span class="ltx_p" id="S0.T1.1.3.1.3.1.1.1.1.1">Statement, Reified statements</span> </span> </span></span> </span> </td> </tr> <tr class="ltx_tr" id="S0.T1.1.4.2"> <td class="ltx_td ltx_align_justify ltx_align_top" id="S0.T1.1.4.2.1"> <span class="ltx_inline-block ltx_align_top" id="S0.T1.1.4.2.1.1"> <span class="ltx_p" id="S0.T1.1.4.2.1.1.1" style="width:0.0pt;">Database management, Data transformation</span> </span> </td> <td class="ltx_td ltx_align_justify ltx_align_top" id="S0.T1.1.4.2.2"> <span class="ltx_inline-block ltx_align_top" id="S0.T1.1.4.2.2.1"> <span class="ltx_p" id="S0.T1.1.4.2.2.1.1" style="width:0.0pt;"> <span class="ltx_inline-block ltx_minipage ltx_align_top" id="S0.T1.1.4.2.2.1.1.1" style="width:433.6pt;"> <span class="ltx_quote" id="S0.T1.1.4.2.2.1.1.1.1"> <span class="ltx_p" id="S0.T1.1.4.2.2.1.1.1.1.1">Making data instances sets available in a useful form</span> </span> </span></span> </span> </td> <td class="ltx_td ltx_nopad_r ltx_align_justify ltx_align_top" id="S0.T1.1.4.2.3"> <span class="ltx_inline-block ltx_align_top" id="S0.T1.1.4.2.3.1"> <span class="ltx_p" id="S0.T1.1.4.2.3.1.1" style="width:0.0pt;"> <span class="ltx_inline-block ltx_minipage ltx_align_top" id="S0.T1.1.4.2.3.1.1.1" style="width:433.6pt;"> <span class="ltx_quote" id="S0.T1.1.4.2.3.1.1.1.1"> <span class="ltx_p" id="S0.T1.1.4.2.3.1.1.1.1.1">Dataset (table, spreadsheet, graph etc)</span> </span> </span></span> </span> </td> </tr> <tr class="ltx_tr" id="S0.T1.1.5.3"> <td class="ltx_td ltx_align_justify ltx_align_top" id="S0.T1.1.5.3.1"> <span class="ltx_inline-block ltx_align_top" id="S0.T1.1.5.3.1.1"> <span class="ltx_p" id="S0.T1.1.5.3.1.1.1" style="width:0.0pt;"> <span class="ltx_inline-block ltx_minipage ltx_align_top" id="S0.T1.1.5.3.1.1.1.1" style="width:433.6pt;"> <span class="ltx_quote" id="S0.T1.1.5.3.1.1.1.1.1"> <span class="ltx_p" id="S0.T1.1.5.3.1.1.1.1.1.1">Query</span> </span> </span></span> </span> </td> <td class="ltx_td ltx_align_justify ltx_align_top" id="S0.T1.1.5.3.2"> <span class="ltx_inline-block ltx_align_top" id="S0.T1.1.5.3.2.1"> <span class="ltx_p" id="S0.T1.1.5.3.2.1.1" style="width:0.0pt;"> <span class="ltx_inline-block ltx_minipage ltx_align_top" id="S0.T1.1.5.3.2.1.1.1" style="width:433.6pt;"> <span class="ltx_quote" id="S0.T1.1.5.3.2.1.1.1.1"> <span class="ltx_p" id="S0.T1.1.5.3.2.1.1.1.1.1">Extraction of data subsets</span> </span> </span></span> </span> </td> <td class="ltx_td ltx_nopad_r ltx_align_justify ltx_align_top" id="S0.T1.1.5.3.3"> <span class="ltx_inline-block ltx_align_top" id="S0.T1.1.5.3.3.1"> <span class="ltx_p" id="S0.T1.1.5.3.3.1.1" style="width:0.0pt;"> <span class="ltx_inline-block ltx_minipage ltx_align_top" 
id="S0.T1.1.5.3.3.1.1.1" style="width:433.6pt;"> <span class="ltx_quote" id="S0.T1.1.5.3.3.1.1.1.1"> <span class="ltx_p" id="S0.T1.1.5.3.3.1.1.1.1.1">Dataset, Resultset</span> </span> </span></span> </span> </td> </tr> <tr class="ltx_tr" id="S0.T1.1.6.4"> <td class="ltx_td ltx_align_justify ltx_align_top" id="S0.T1.1.6.4.1"> <span class="ltx_inline-block ltx_align_top" id="S0.T1.1.6.4.1.1"> <span class="ltx_p" id="S0.T1.1.6.4.1.1.1" style="width:0.0pt;"> <span class="ltx_inline-block ltx_minipage ltx_align_top" id="S0.T1.1.6.4.1.1.1.1" style="width:433.6pt;"> <span class="ltx_quote" id="S0.T1.1.6.4.1.1.1.1.1"> <span class="ltx_p" id="S0.T1.1.6.4.1.1.1.1.1.1">Governance</span> </span> </span></span> </span> </td> <td class="ltx_td ltx_align_justify ltx_align_top" id="S0.T1.1.6.4.2"> <span class="ltx_inline-block ltx_align_top" id="S0.T1.1.6.4.2.1"> <span class="ltx_p" id="S0.T1.1.6.4.2.1.1" style="width:0.0pt;"> <span class="ltx_inline-block ltx_minipage ltx_align_top" id="S0.T1.1.6.4.2.1.1.1" style="width:433.6pt;"> <span class="ltx_quote" id="S0.T1.1.6.4.2.1.1.1.1"> <span class="ltx_p" id="S0.T1.1.6.4.2.1.1.1.1.1">Selecting particular datasets for use</span> </span> </span></span> </span> </td> <td class="ltx_td ltx_nopad_r ltx_align_justify ltx_align_top" id="S0.T1.1.6.4.3"> <span class="ltx_inline-block ltx_align_top" id="S0.T1.1.6.4.3.1"> <span class="ltx_p" id="S0.T1.1.6.4.3.1.1" style="width:0.0pt;"> <span class="ltx_inline-block ltx_minipage ltx_align_top" id="S0.T1.1.6.4.3.1.1.1" style="width:433.6pt;"> <span class="ltx_quote" id="S0.T1.1.6.4.3.1.1.1.1"> <span class="ltx_p" id="S0.T1.1.6.4.3.1.1.1.1.1">Dataset</span> </span> </span></span> </span> </td> </tr> <tr class="ltx_tr" id="S0.T1.1.7.5"> <td class="ltx_td ltx_align_justify ltx_align_top" id="S0.T1.1.7.5.1"> <span class="ltx_inline-block ltx_align_top" id="S0.T1.1.7.5.1.1"> <span class="ltx_p" id="S0.T1.1.7.5.1.1.1" style="width:0.0pt;"> <span class="ltx_inline-block ltx_minipage ltx_align_top" id="S0.T1.1.7.5.1.1.1.1" style="width:433.6pt;"> <span class="ltx_quote" id="S0.T1.1.7.5.1.1.1.1.1"> <span class="ltx_p" id="S0.T1.1.7.5.1.1.1.1.1.1">Bulk object processing</span> </span> </span></span> </span> </td> <td class="ltx_td ltx_align_justify ltx_align_top" id="S0.T1.1.7.5.2"> <span class="ltx_inline-block ltx_align_top" id="S0.T1.1.7.5.2.1"> <span class="ltx_p" id="S0.T1.1.7.5.2.1.1" style="width:0.0pt;"> <span class="ltx_inline-block ltx_minipage ltx_align_top" id="S0.T1.1.7.5.2.1.1.1" style="width:433.6pt;"> <span class="ltx_quote" id="S0.T1.1.7.5.2.1.1.1.1"> <span class="ltx_p" id="S0.T1.1.7.5.2.1.1.1.1.1">Indexing, classification, clustering</span> </span> </span></span> </span> </td> <td class="ltx_td ltx_nopad_r ltx_align_justify ltx_align_top" id="S0.T1.1.7.5.3"> <span class="ltx_inline-block ltx_align_top" id="S0.T1.1.7.5.3.1"> <span class="ltx_p" id="S0.T1.1.7.5.3.1.1" style="width:0.0pt;"> <span class="ltx_inline-block ltx_minipage ltx_align_top" id="S0.T1.1.7.5.3.1.1.1" style="width:433.6pt;"> <span class="ltx_quote" id="S0.T1.1.7.5.3.1.1.1.1"> <span class="ltx_p" id="S0.T1.1.7.5.3.1.1.1.1.1">Whole-of-workflow</span> </span> </span></span> </span> </td> </tr> <tr class="ltx_tr" id="S0.T1.1.8.6"> <td class="ltx_td ltx_align_justify ltx_align_top" id="S0.T1.1.8.6.1"> <span class="ltx_inline-block ltx_align_top" id="S0.T1.1.8.6.1.1"> <span class="ltx_p" id="S0.T1.1.8.6.1.1.1" style="width:0.0pt;"> <span class="ltx_inline-block ltx_minipage ltx_align_top" id="S0.T1.1.8.6.1.1.1.1" style="width:433.6pt;"> <span 
class="ltx_quote" id="S0.T1.1.8.6.1.1.1.1.1"> <span class="ltx_p" id="S0.T1.1.8.6.1.1.1.1.1.1">Document analysis</span> </span> </span></span> </span> </td> <td class="ltx_td ltx_align_justify ltx_align_top" id="S0.T1.1.8.6.2"> <span class="ltx_inline-block ltx_align_top" id="S0.T1.1.8.6.2.1"> <span class="ltx_p" id="S0.T1.1.8.6.2.1.1" style="width:0.0pt;"> <span class="ltx_inline-block ltx_minipage ltx_align_top" id="S0.T1.1.8.6.2.1.1.1" style="width:433.6pt;"> <span class="ltx_quote" id="S0.T1.1.8.6.2.1.1.1.1"> <span class="ltx_p" id="S0.T1.1.8.6.2.1.1.1.1.1">Making information elements in a document available to finer grained processes</span> </span> </span></span> </span> </td> <td class="ltx_td ltx_nopad_r ltx_align_justify ltx_align_top" id="S0.T1.1.8.6.3"> <span class="ltx_inline-block ltx_align_top" id="S0.T1.1.8.6.3.1"> <span class="ltx_p" id="S0.T1.1.8.6.3.1.1" style="width:0.0pt;"> <span class="ltx_inline-block ltx_minipage ltx_align_top" id="S0.T1.1.8.6.3.1.1.1" style="width:433.6pt;"> <span class="ltx_quote" id="S0.T1.1.8.6.3.1.1.1.1"> <span class="ltx_p" id="S0.T1.1.8.6.3.1.1.1.1.1">Document, derived dataset</span> </span> </span></span> </span> </td> </tr> <tr class="ltx_tr" id="S0.T1.1.9.7"> <td class="ltx_td ltx_align_justify ltx_align_top" id="S0.T1.1.9.7.1"> <span class="ltx_inline-block ltx_align_top" id="S0.T1.1.9.7.1.1"> <span class="ltx_p" id="S0.T1.1.9.7.1.1.1" style="width:0.0pt;"> <span class="ltx_inline-block ltx_minipage ltx_align_top" id="S0.T1.1.9.7.1.1.1.1" style="width:433.6pt;"> <span class="ltx_quote" id="S0.T1.1.9.7.1.1.1.1.1"> <span class="ltx_p" id="S0.T1.1.9.7.1.1.1.1.1.1">KG Management</span> </span> </span></span> </span> </td> <td class="ltx_td ltx_align_justify ltx_align_top" id="S0.T1.1.9.7.2"> <span class="ltx_inline-block ltx_align_top" id="S0.T1.1.9.7.2.1"> <span class="ltx_p" id="S0.T1.1.9.7.2.1.1" style="width:0.0pt;"> <span class="ltx_inline-block ltx_minipage ltx_align_top" id="S0.T1.1.9.7.2.1.1.1" style="width:433.6pt;"> <span class="ltx_quote" id="S0.T1.1.9.7.2.1.1.1.1"> <span class="ltx_p" id="S0.T1.1.9.7.2.1.1.1.1.1">Est’ment of state of complex, modular KGs, change tracking, support for automated up-dates</span> </span> </span></span> </span> </td> <td class="ltx_td ltx_nopad_r ltx_align_justify ltx_align_top" id="S0.T1.1.9.7.3"> <span class="ltx_inline-block ltx_align_top" id="S0.T1.1.9.7.3.1"> <span class="ltx_p" id="S0.T1.1.9.7.3.1.1" style="width:0.0pt;"> <span class="ltx_inline-block ltx_minipage ltx_align_top" id="S0.T1.1.9.7.3.1.1.1" style="width:433.6pt;"> <span class="ltx_quote" id="S0.T1.1.9.7.3.1.1.1.1"> <span class="ltx_p" id="S0.T1.1.9.7.3.1.1.1.1.1">Graph (Dataset)</span> </span> </span></span> </span> </td> </tr> <tr class="ltx_tr" id="S0.T1.1.10.8"> <td class="ltx_td ltx_align_justify ltx_align_top ltx_border_tt" id="S0.T1.1.10.8.1"> <span class="ltx_inline-block ltx_align_top" id="S0.T1.1.10.8.1.1"> <span class="ltx_p" id="S0.T1.1.10.8.1.1.1" style="width:0.0pt;">()</span> </span> </td> <td class="ltx_td ltx_align_justify ltx_align_top ltx_border_tt" id="S0.T1.1.10.8.2"></td> <td class="ltx_td ltx_align_justify ltx_align_top ltx_border_tt" id="S0.T1.1.10.8.3"></td> </tr> </tbody> </table> </figure> <div class="ltx_para" id="p50"> <blockquote class="ltx_quote" id="p50.1"> <p class="ltx_p" id="p50.1.1">TABLE I <br class="ltx_break"/>A LIST OF PROJECT FUNCTIONS, EXAMPLES OF THEM AND (OUR) REQUIRED PROVENANCE GRANULARITY</p> <p class="ltx_p" id="p50.1.2">stores and different information the board frameworks. 
B. General-purpose Provenance Tracking Tools

In addition to the major tools described above, we also make use of several general-purpose systems for specific provenance tracking tasks. These tools help us maintain flexibility in managing provenance across a wide variety of project types and data sources.

• RDFlib: RDFlib is a general-purpose Python library for working with RDF (Resource Description Framework) data. It is widely used in our organisation for manipulating RDF graphs, which are the fundamental data structures for representing relationships between entities in a semantic setting. Many of our data objects, including provenance data, are represented as RDF graphs, which allows for consistent and flexible manipulation. One key feature of RDFlib is its ability to support reified provenance, which involves making detailed records of the context in which RDF statements are made. This reification process is essential for maintaining the integrity of provenance information and ensuring that every action taken on the data can be traced back to its origin. We maintain a number of RDFlib code blocks to facilitate the creation and manipulation of reified provenance for RDF statements, allowing us to preserve the full history of data changes within our systems (a small reification sketch is given below).

Fig. 2. SURROUND's provenance tools linked to system type
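To illustrate the kind of statement-level reification mentioned above, here is a small rdflib sketch of our own (not SURROUND's code) that attaches provenance to a single RDF statement via rdf:Statement reification; the URIs are placeholders.

    # Sketch: attaching provenance to one RDF statement via rdf:Statement
    # reification. Illustrative only; example.org URIs are placeholders.
    from rdflib import Graph, Namespace, BNode, Literal
    from rdflib.namespace import RDF, XSD

    PROV = Namespace("http://www.w3.org/ns/prov#")
    EX = Namespace("http://example.org/")

    g = Graph()
    g.bind("prov", PROV)
    g.bind("ex", EX)

    # The base assertion: a document was classified into a category.
    g.add((EX.doc1, EX.hasCategory, EX.FinanceRecords))

    # Reify that single statement so provenance can be attached to it.
    stmt = BNode()
    g.add((stmt, RDF.type, RDF.Statement))
    g.add((stmt, RDF.subject, EX.doc1))
    g.add((stmt, RDF.predicate, EX.hasCategory))
    g.add((stmt, RDF.object, EX.FinanceRecords))

    # Statement-level provenance: which activity produced this classification.
    g.add((stmt, PROV.wasGeneratedBy, EX["activity/classification-run-9"]))
    g.add((stmt, PROV.generatedAtTime,
           Literal("2025-01-12T10:05:00Z", datatype=XSD.dateTime)))

    print(g.serialize(format="turtle"))

This pattern keeps the original triple untouched while allowing each individual classification to carry its own verifiable history.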
C. Provenance Tracking Workflow

The overall workflow for tracking provenance within our organisation begins with the identification of the assets involved in a particular project or data processing task. Each asset is assigned a unique URI to ensure that it can be reliably referenced across different systems and tools. As the asset undergoes changes, such as modifications, processing, or analysis, the provenance of each action is recorded in PROV-DM format, capturing details such as the entity involved, the action performed, and the time of the action.

This provenance data is then integrated into the relevant tools and systems, ensuring that it is available for later retrieval, analysis, and verification. Whether through SOP, ProvWF, or Git, the provenance data is stored in a way that allows it to be easily queried and visualised. In the case of complex workflows, provenance Bundles are transferred between systems to ensure that all steps in the process are recorded in a coherent manner (a hypothetical transfer sketch is given below).
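The following sketch shows one plausible way a provenance Bundle could be packaged as a named graph and handed to another system over HTTP. It is purely illustrative: the endpoint URL and media type are assumptions, and it does not describe SOP's actual ingestion interface.

    # Sketch: packaging a provenance Bundle as a named graph (TriG) and
    # POSTing it to a receiving system. The endpoint and media type are
    # hypothetical; this is not SOP's real API.
    import requests
    from rdflib import Dataset, Namespace, URIRef
    from rdflib.namespace import RDF

    PROV = Namespace("http://www.w3.org/ns/prov#")
    EX = Namespace("http://example.org/")

    ds = Dataset()
    ds.bind("prov", PROV)
    ds.bind("ex", EX)

    bundle_uri = URIRef("http://example.org/bundle/workflow-run-1")
    bundle = ds.graph(bundle_uri)  # the Bundle's contents live in a named graph

    # Type the named graph itself as a prov:Bundle in the default graph.
    ds.add((bundle_uri, RDF.type, PROV.Bundle))

    # A couple of illustrative triples inside the Bundle.
    bundle.add((EX["activity/ner-run-1"], RDF.type, PROV.Activity))
    bundle.add((EX["classification/123"], PROV.wasGeneratedBy, EX["activity/ner-run-1"]))

    payload = ds.serialize(format="trig")  # str in recent rdflib versions

    # Hypothetical ingestion endpoint; replace with the receiving system's URL.
    response = requests.post(
        "https://provenance-store.example.org/bundles",
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/trig"},
    )
    print(response.status_code)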
By adopting this architecture, we ensure that every step in our data processing pipelines is traceable, transparent, and verifiable, which is essential for maintaining high standards of data integrity, governance, and accountability.

IV. ASPECTS OF OUR PROVENANCE MODELLING

Much of our provenance modelling will be familiar to PROV users: chains of Activities and Entities associated with Agents. However, we have encountered some project-specific situations that require slight specialisations. These situations, which are detailed in the following project case studies, show the diverse uses of provenance in our work and the customisations we have made to support the varying needs of different workflows.

A. Project 1: Electronic Records Assessment

A recent project within our organisation focused on the use of Natural Language Processing (NLP) and Knowledge Graphs (KGs) to classify electronic records for archiving purposes. This project involved extracting elements of documents' content, comparing them against managed sources of context presented as KGs, and using machine learning (ML) techniques to learn the desired classification strategies. The challenge of effectively classifying such records requires detailed provenance tracking at multiple levels of the data lifecycle, from content extraction to the application of ML models. The central component of this project's provenance capture was the use of data services: our classification workflow system querying our own KG services. We tracked both the overall workflow provenance and the individual data statement provenance. The former gave a high-level view of the entire classification process, while the latter enabled us to validate the classifications made within a record's metadata.

Figure 3 illustrates our model for tracking data service queries within the workflow. The full workflow provenance is essential for tracing the configurations that lead to specific results, giving transparency into how different data configurations influence the final outcomes. Each execution instance of the workflow and its constituent activities is recorded as a PROV Activity instance, while the data, such as records' content and classified metadata, is captured as PROV Entity instances. This allows us to keep an accurate and complete record of how each data element was transformed along the way.

Fig. 3. ProvWF is often used to generate RDF data - here the "Interim Product" - which can be supplied to SOP with an accompanying provenance Bundle. SOP, in turn, generates both Bundles of provenance for any actions on data it performs and also records usage provenance for products.

Individual data statement provenance is particularly crucial for the classifications performed on record metadata, which is stored in RDF form. This allows each classification result to be verified independently. Workflow provenance, metadata, and metadata provenance are all stored in an RDF database, with cross-references between the elements to ensure that all parts of the process are traceable and verifiable.

In the years since the publication of PROV, we have observed various extensions and specialisations tailored for workflow provenance. One such extension is PROV-Wf [5], which we adopted to handle workflow-specific provenance. However, we have found that the Plan, or instructions for executing workflows, is typically embedded within the workflow's defining software code. Consequently, our ProvWF tool records a URI reference to the specific version of the code (either a Git commit or release) that was executed for each workflow.
This ensures that the exact version of the software that produced the results can be traced back, maintaining a clear link between the workflow execution and the underlying code (a sketch of this linking pattern appears at the end of this subsection).

ProvWF requires custom PROV-style logging to be defined for each custom Block (workflow component). However, if predefined Blocks from our Block Library are used, this task becomes simpler. While we have investigated systems that generate PROV-compatible workflow provenance more automatically, such as the approach presented in [6], we have found that the level of workflow specification required by these systems, particularly the use of dedicated business process modelling languages, exceeds the granularity of provenance we need for our workflows. These methods also add the complexity of defining non-executable data structures, making them more cumbersome than simply defining PROV directly.

In recent work on scientific workflow provenance [7], there has been an emphasis on modelling control flow to answer questions such as: "What are the reasons for different results in two executions of a workflow?" While we do not currently implement such control-flow modelling in ProvWF or our other tools, we represent workflows as a simple Workflow containing Blocks arranged linearly over time. All control flow decisions are subsumed into the Blocks, which, while making them more complex, has allowed us to effectively model all the significant control flow decisions within the workflow. Should we decide to focus more closely on control flow elements in later projects, we expect to represent these as specialised Blocks with templated (expected) inputs and outputs. By comparing instances of these specialised Blocks, we can gain insight into the specific control flow choices made during workflow execution.
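As a concrete, hedged illustration of recording the executed code version, the sketch below attaches a Git commit URI to a workflow run as a prov:Plan via a qualified association. The URIs are placeholders and the modelling choice is ours for illustration; it is not necessarily the vocabulary ProvWF uses internally.

    # Sketch: linking a workflow run to the Git commit of the code that
    # defined it, modelled as a prov:Plan via a qualified association.
    # Illustrative only; URIs and modelling choices are assumptions.
    from rdflib import Graph, Namespace, URIRef, BNode
    from rdflib.namespace import RDF

    PROV = Namespace("http://www.w3.org/ns/prov#")
    EX = Namespace("http://example.org/")

    g = Graph()
    g.bind("prov", PROV)
    g.bind("ex", EX)

    run = EX["workflow-run/42"]
    software = EX["agent/workflow-tool"]
    # Placeholder URI standing in for a specific Git commit or release.
    plan = URIRef("https://example.org/repo/commit/0a1b2c3d")

    g.add((run, RDF.type, PROV.Activity))
    g.add((software, RDF.type, PROV.SoftwareAgent))
    g.add((plan, RDF.type, PROV.Plan))

    # Qualified association: the run was carried out by the software agent,
    # following the plan embodied by that exact code version.
    assoc = BNode()
    g.add((run, PROV.qualifiedAssociation, assoc))
    g.add((assoc, RDF.type, PROV.Association))
    g.add((assoc, PROV.agent, software))
    g.add((assoc, PROV.hadPlan, plan))
    g.add((run, PROV.wasAssociatedWith, software))

    print(g.serialize(format="turtle"))

Because the plan is identified with the same URI scheme used for the Git-managed assets, an auditor can locate the exact code that produced a given result.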
To track the effectiveness of different content decomposition strategies and reference datasets, such as vocabularies of industry-specific terms, we implemented multi-system provenance tracking. This allowed us to comprehensively model the evolution of the datasets and the interactions between queries and their results.</p> <p class="ltx_p" id="p69.1.4">We modelled the various datasets, many of which were KGs, as PROV <em class="ltx_emph ltx_font_italic">Entity</em> instances. We tracked the state of these datasets over time as queries were made to the multi-component system. Each query was treated as a PROV Activity performed by an anonymous Agent, and we used web logs to extract results and track changes in the KG state, much like the method used by some of the authors in previous work [8]. This allowed us to capture the provenance of query execution, including the particular datasets queried and the results returned, giving detailed insight into the interactions between users and the system.</p> </blockquote> </div> <div class="ltx_para" id="p70"> <p class="ltx_p" id="p70.1">Tracking the reference dataset state in this project was done using our SOP tool, which has incorporated graph state tracking capabilities over several years. This enables us to capture provenance related to changes, new data insertions, and other modifications within datasets. In SOP, we can refer to the state of an entire collection of resources using a single URI reference: a <em class="ltx_emph ltx_font_italic">version</em> of a resource collection. This versioned reference is then used in the provenance records of workflows and queries, ensuring that cross-querying of provenance is possible. By linking the queries and workflows to specific versions of dataset collections, we ensure that all of the data involved in the process is accurately tracked and linked to its state at the time of use.</p> </div>
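<div class="ltx_para"> <p class="ltx_p">As an illustration only (placeholder URIs, not the SOP or <em class="ltx_emph ltx_font_italic">ProvWF</em> code), the sketch below records one user query in this style: the query is a prov:Activity associated with an anonymous prov:Agent, it used a specific versioned dataset collection, and it generated a result-set entity. Because the version URI identifies the state of the whole collection, workflow runs and user queries can later be joined on the same versioned entity.</p> </div> <div class="ltx_para"> <pre class="ltx_verbatim ltx_font_typewriter">
# Hypothetical sketch: one query over a versioned dataset collection as PROV-O.
from rdflib import Graph, Namespace, BNode, Literal
from rdflib.namespace import RDF, XSD

PROV = Namespace("http://www.w3.org/ns/prov#")
EX = Namespace("http://example.org/report-kg/")       # placeholder namespace

g = Graph()
g.bind("prov", PROV)

query = EX["query/2024-06-01-0007"]                   # one query execution
dataset_v3 = EX["vocab/industry-terms/version/3"]     # versioned collection URI
result = EX["query/2024-06-01-0007/result"]           # returned result set
agent = BNode()                                       # anonymous querying agent

g.add((query, RDF.type, PROV.Activity))
g.add((dataset_v3, RDF.type, PROV.Entity))
g.add((result, RDF.type, PROV.Entity))
g.add((agent, RDF.type, PROV.Agent))

g.add((query, PROV.used, dataset_v3))                 # exact dataset state queried
g.add((query, PROV.wasAssociatedWith, agent))
g.add((result, PROV.wasGeneratedBy, query))
g.add((query, PROV.endedAtTime,
       Literal("2024-06-01T10:15:00Z", datatype=XSD.dateTime)))
</pre> </div>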
<div class="ltx_para" id="p71"> <p class="ltx_p" id="p71.1">This project also involved using the SOP tool’s ability to track individual elements within datasets, ensuring that the overall dataset, as well as the components within it, was accurately tracked over time. This level of granularity is essential for maintaining the integrity of the data and ensuring that changes to specific terms or definitions within the document are captured as part of the overall provenance.</p> </div> <div class="ltx_para" id="p72"> <p class="ltx_p" id="p72.1">Through these two projects, we have demonstrated the flexibility and adaptability of our provenance architecture. Whether dealing with complex workflows for document classification or tracking semantic decomposition and querying within a large document, we have used customized approaches to ensure that provenance is captured accurately and in a way that supports verification, transparency, and reproducibility.</p> </div> <div class="ltx_para" id="p73"> <blockquote class="ltx_quote" id="p73.1"> <p class="ltx_p" id="p73.1.1">V. REFLECTIONS ON PROV MODELLING</p> </blockquote> </div> <div class="ltx_para" id="p74"> <p class="ltx_p" id="p74.1">Over the course of our work with provenance modelling, particularly with the PROV-DM model in its PROV-O form, we have come to deeply appreciate the graph-based nature of the model. The ability to represent provenance as a graph offers an intuitive and flexible way to model complex relationships between components such as <em class="ltx_emph ltx_font_italic">Entities</em>, <em class="ltx_emph ltx_font_italic">Activities</em>, and <em class="ltx_emph ltx_font_italic">Agents</em>. In our experience, the graph-based structure of PROV has been key to capturing the intricate interconnections within multi-system workflows, providing a clear and robust view of the provenance of data and processes.</p> </div> <div class="ltx_para" id="p75"> <p class="ltx_p" id="p75.1">One of the key strengths of PROV, particularly in its RDF implementation, is its use of object identification. By using unique identifiers for every entity, activity, and agent, we can create a well-defined and persistent record of provenance across different systems. This has enabled us to store provenance data in several kinds of systems while still retaining the ability to cross-query and analyse it. This flexibility is crucial, as it allows us to work with different kinds of databases and data stores, from simple relational databases to more sophisticated graph-based systems, without losing the traceability of the data.</p> </div>
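<div class="ltx_para"> <p class="ltx_p">For illustration, the short sketch below shows the kind of cross-query that stable identifiers make possible: starting from a result entity’s URI, a SPARQL query walks back to the activities that generated it and the source entities they used. The file name and URIs are placeholders, and rdflib is used only for convenience; the same query could be run against any store holding the PROV graph.</p> </div> <div class="ltx_para"> <pre class="ltx_verbatim ltx_font_typewriter">
# Illustrative lineage query over a PROV graph (placeholder file and URIs).
from rdflib import Graph, Namespace, URIRef

PROV = Namespace("http://www.w3.org/ns/prov#")

g = Graph().parse("provenance.ttl", format="turtle")  # placeholder provenance dump

LINEAGE = """
SELECT ?activity ?source
WHERE {
  ?result prov:wasGeneratedBy ?activity .
  ?activity prov:used ?source .
}
"""

result_uri = URIRef("http://example.org/provwf/entity/document-123-metadata")
for activity, source in g.query(LINEAGE,
                                initNs={"prov": PROV},
                                initBindings={"result": result_uri}):
    print(activity, source)
</pre> </div>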
<div class="ltx_para" id="p76"> <blockquote class="ltx_quote" id="p76.1"> <p class="ltx_p" id="p76.1.1">Furthermore, PROV’s use of extensible graphs has been highly beneficial for our work. As our projects frequently require modelling complex systems at varying levels of detail, the ability to extend the PROV model with custom attributes and relationships has allowed us to capture the necessary complexity while maintaining consistency and interoperability across different tools and systems. The extensibility of PROV has allowed us to incorporate domain-specific information without compromising the core structure of the provenance model, which is a key factor in ensuring that the model remains both scalable and adaptable.</p> <p class="ltx_p" id="p76.1.2">Owing to these strengths of PROV, we have been able to model “anything” in our systems at various levels of granularity. This versatility has been especially useful when we need to capture provenance at different scales, from fine-grained details about individual data elements to high-level workflows and their overall effects. We can store provenance information for entire workflows, for example the steps involved in processing data, or for individual data elements, such as the changes or decisions that lead to specific results. This ability to model both macro and micro levels of provenance has been invaluable for performing detailed audits, aggregating results, and uncovering insights from complex systems.</p> <p class="ltx_p" id="p76.1.3">Despite the many benefits of PROV, there have been some challenges and limitations. These mainly stem from situations where the built-in capabilities of PROV do not fully align with the particular requirements of our use cases. While these issues have not prevented us from successfully implementing provenance models, they have required us to develop custom solutions and extensions. Below, we describe the most significant issues we have faced:</p> <p class="ltx_p" id="p76.1.4">1) <span class="ltx_text ltx_font_bold">Difficulty in Storing Complex Data in Provenance Graphs</span></p> <p class="ltx_p" id="p76.1.5"><em class="ltx_emph ltx_font_italic">•</em> A notable challenge we have encountered involves storing complex data objects within our provenance graphs. While PROV excels at representing the relationships between entities, activities, and agents, we often need to associate complicated, structured data with specific provenance records. In general, we prefer not to store these complex objects separately from the provenance graphs, as doing so would undermine the integrity of the data model and introduce unnecessary complexity into our systems. However, representing complex objects within the provenance graph has proven difficult without relying on overly convoluted Semantic Web modelling. The challenge lies in encoding these objects in a way that is both efficient and easy to manage while preserving the relationships between the various data components. Furthermore, the need to maintain these complex objects within the same system presents potential performance and scalability issues, particularly as the size and complexity of the data increase.</p> </blockquote> </div> <div class="ltx_para" id="p77"> <img alt="[Uncaptioned image]" class="ltx_graphics ltx_img_landscape" height="215" id="p77.g1" src="extracted/6126849/vertopal_a3830e0462ac42ceaa38505a019a9e40/media/image4.png" width="712"/> </div> <div class="ltx_para" id="p78"> <p class="ltx_p" id="p78.1">Fig. 4. <span class="ltx_text ltx_font_bold">A</span>. A <em class="ltx_emph ltx_font_italic">Service query block</em> from our <em class="ltx_emph ltx_font_italic">Block Library</em>, implemented within our <em class="ltx_emph ltx_font_italic">ProvWF</em> framework using a Query and other configuration (Config) to query a Web Service agent for a Result. Provenance for the web service itself, now considered an entity, is recorded in Git systems and referenced by each <em class="ltx_emph ltx_font_italic">ProvWF</em> execution. <span class="ltx_text ltx_font_bold">B</span>. Reified provenance for a single RDF triple associated with the <em class="ltx_emph ltx_font_italic">ProvWF</em> Block instance that generated it.</p> </div> <div class="ltx_para" id="p81"> <blockquote class="ltx_quote" id="p81.1"> <p class="ltx_p" id="p81.1.2">2) <span class="ltx_text ltx_font_bold">Linking Entity Instances to Plan Instances</span></p> <p class="ltx_p" id="p81.1.3"><em class="ltx_emph ltx_font_italic">•</em> Another difficulty we have faced is linking PROV Entity instances to Plan instances. While PROV provides a straightforward way to model entities and activities, it does not explicitly support a persistent relationship between Entity and Plan instances, which is crucial for tracing the workflows that generated the entities. For instance, in some of our projects we need to trace an Entity back to the particular Plan or workflow that was used to create it. This is especially helpful when examining the impacts of different workflow configurations or understanding the effect of specific planning decisions on the results. However, PROV lacks a built-in mechanism to guarantee a permanent and queryable relationship between Entity and Plan instances. This limitation has required us to explore custom solutions, such as introducing additional metadata or creating custom relationships between entities and plans (a minimal sketch of one such custom link follows this list). While this approach has worked in practice, it has added complexity to our provenance models and introduced potential challenges in maintaining the consistency of these relationships over time.</p> </blockquote> </div>
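<div class="ltx_para"> <p class="ltx_p">The sketch below illustrates one form such a workaround could take; it is a hypothetical example, not our production model. The custom property ex:wasDerivedUsingPlan is not a PROV term: it is added in a project namespace alongside the standard Entity-to-Activity-to-Association-to-Plan path, so the graph remains valid PROV while the Entity-to-Plan link becomes directly queryable.</p> </div> <div class="ltx_para"> <pre class="ltx_verbatim ltx_font_typewriter">
# Hypothetical sketch: a custom, directly queryable Entity-to-Plan link
# recorded alongside the standard PROV qualified-association path.
from rdflib import Graph, Namespace, URIRef
from rdflib.namespace import RDF

PROV = Namespace("http://www.w3.org/ns/prov#")
EX = Namespace("http://example.org/provwf/")          # placeholder namespace

g = Graph()
g.bind("prov", PROV)
g.bind("ex", EX)

entity = EX["entity/document-123-metadata"]
activity = EX["run/42"]
plan = URIRef("https://example.org/git/repo/commit/abc123")
assoc = EX["run/42/association"]

# Standard PROV path: Entity, generating Activity, qualified Association, Plan.
g.add((entity, PROV.wasGeneratedBy, activity))
g.add((activity, PROV.qualifiedAssociation, assoc))
g.add((assoc, RDF.type, PROV.Association))
g.add((assoc, PROV.hadPlan, plan))
g.add((plan, RDF.type, PROV.Plan))

# Non-standard shortcut in our own namespace: Entity linked straight to Plan.
g.add((entity, EX.wasDerivedUsingPlan, plan))
</pre> </div>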
<div class="ltx_para" id="p82"> <p class="ltx_p" id="p82.1">Despite these difficulties, we have continued to work with the PROV model and have developed several custom extensions and adaptations to address the limitations we have encountered. These modifications have allowed us to keep using PROV effectively while maintaining the flexibility and scalability that are critical for our projects.</p> </div> <div class="ltx_para" id="p83"> <p class="ltx_p" id="p83.1">VI. CONCLUSIONS</p> </div> <div class="ltx_para" id="p84"> <blockquote class="ltx_quote" id="p84.1"> <p class="ltx_p" id="p84.1.1">In conclusion, Encompass has been able to use the PROV framework effectively to model provenance across multiple systems and within various IT domains. Our ability to adapt the PROV model to the needs of our particular use cases has been a key factor in the success of our projects. By exploiting the graph-based structure of PROV, we have been able to capture detailed relationships between entities, activities, and agents, ensuring that we can trace the lineage of data and understand the workflows that produced it. This has given us the flexibility to represent complex systems and workflows in a clear and efficient way.</p> <p class="ltx_p" id="p84.1.2">The use of PROV has enabled us to provide our clients with the confidence and transparency they expect in today’s data-driven world. By exposing the provenance of individual results, we can help our clients understand how data is generated and processed, ensuring that they can trust the results we provide. This is especially important for complex AI/ML applications, where the traceability and explainability of results are essential for ensuring the reliability and validity of the models. Moreover, the ability to track and analyse the performance of our systems through detailed provenance data has allowed us to gain valuable insights into the effectiveness of our workflows and to identify opportunities for improvement.</p> <p class="ltx_p" id="p84.1.3">One of the main lessons we have learned during this process is the importance of integrating provenance tracking into the core of our systems. By embedding provenance capabilities directly into our workflows, we have been able to capture the necessary data without introducing unnecessary overhead or complexity.
We have developed both dedicated provenance tools and integrated provenance features within existing systems, ensuring that provenance is consistently tracked at each stage of the process.</p> </blockquote> </div> <div class="ltx_para" id="p85"> <p class="ltx_p" id="p85.1">This approach has allowed us to achieve our objectives without resorting to highly specialized or overly complex implementations of PROV.</p> </div> <div class="ltx_para" id="p86"> <p class="ltx_p" id="p86.1">Looking ahead, we anticipate that our use of PROV will continue to evolve as our systems become more complex and our requirements become more specialized. While we have not yet needed highly specialized variants of PROV, we recognise that as our projects progress we may have to explore more advanced extensions or customizations to support new requirements. In the future, we may need to further increase the granularity or specificity of the provenance we capture, particularly as we work with more complex AI/ML models or as the scale of our systems grows.</p> </div> <div class="ltx_para" id="p87"> <p class="ltx_p" id="p87.1">In summary, PROV has been a valuable tool for Encompass’ provenance modelling, allowing us to capture and track the lineage of data across different systems and workflows. The flexibility, scalability, and extensibility of PROV have made it an excellent choice for our operations, and we expect to continue using and refining this model. Through our ongoing work with PROV, we are confident that we can meet the evolving needs of our organization and our clients, ensuring that our systems remain transparent, reliable, and trustworthy.</p> </div> <div class="ltx_para" id="p88"> <blockquote class="ltx_quote" id="p88.1"> <p class="ltx_p" id="p88.1.1">REFERENCES</p> </blockquote> </div> <div class="ltx_para" id="p89"> <p class="ltx_p" id="p89.1">[1] H. Knublauch and D. Kontokostas, “Shapes Constraint Language (SHACL),” W3C RDF Data Shapes Working Group, W3C Recommendation, 2017. [Online]. Available: https://www.w3.org/TR/shacl/ <br class="ltx_break"/>[2] L. Moreau and P. Missier, “PROV-DM: The PROV Data Model,” World Wide Web Consortium, W3C Recommendation, 2013. [Online].</p> </div> <div class="ltx_para" id="p90"> <p class="ltx_p" id="p90.1">Available: https://www.w3.org/TR/prov-dm/ <br class="ltx_break"/>[3] T. Lebo, S. Sahoo, and D. McGuinness, “PROV-O: The PROV Ontology,” W3C Provenance Working Group, W3C Recommendation, 2013. [Online]. Available: http://www.w3.org/TR/prov-o/ <br class="ltx_break"/>[4] T. De Nies, S. Magliacane, R. Verborgh, S. Coppens, P. Groth, E. Mannens, and R. Van de Walle, “Git2PROV: Exposing Version Control System Content as W3C PROV,” in <em class="ltx_emph ltx_font_italic">Proceedings of the 12th International Semantic Web Conference</em>, vol. II. Springer.</p> </div> <div class="ltx_para" id="p91"> <p class="ltx_p" id="p91.1">[Online]. Available: https://github.com/IDLabResearch/Git2PROV <br class="ltx_break"/>[5] F. Costa, V. Silva, D. de Oliveira, K. Ocaña, E. Ogasawara, J. Dias, and M.
Mattoso, “Capturing and querying workflow runtime provenance with prov: A practical approach,” in <em class="ltx_emph ltx_font_italic">Proceedings of the Joint EDBT/ICDT 2013 Workshops</em>, ser. EDBT ’13. New York, NY, USA: Association for Computing Machinery, 2013, pp. 282–289.</p> </div> <div class="ltx_para" id="p92"> <p class="ltx_p" id="p92.1">[6] A. Prabhune, A. Zweig, R. Stotzka, M. Gertz, and J. Hesser, “Prov2ONE: An Algorithm for Automatically Constructing ProvONE Provenance Graphs,” in <em class="ltx_emph ltx_font_italic">Provenance and Annotation of Data and Processes</em>, ser. Lecture Notes in Computer Science, M. Mattoso and B. Glavic, Eds. Cham: Springer International Publishing, 2016, pp. 204–208.</p> </div> <div class="ltx_para" id="p94"> <p class="ltx_p" id="p94.1">[7] A. S. Butt and P. Fitch, “A provenance model for control-flow driven scientific workflows,” <em class="ltx_emph ltx_font_italic">Data &amp; Knowledge Engineering</em>, p. 101877, Feb. 2021. [Online]. Available: https://linkinghub.elsevier.com/retrieve/pii/S0169023X21000045 <br class="ltx_break"/>[8] N. J. Car, L. S. Stanford, and A. Sedgmen, “Enabling Web Service Request Citation by Provenance Information,” in <em class="ltx_emph ltx_font_italic">Provenance and Annotation of Data and Processes: 6th International Provenance and Annotation Workshop, IPAW 2016, McLean, VA, USA, June 7-8, 2016, Proceedings</em>, M. Mattoso and B. Glavic, Eds. Cham: Springer International Publishing, 2016, pp. 122–133. [Online]. Available: http://dx.doi.org/10.1007/978-3-319-40593-3_10</p> </div> </article> </div> </div> </body> </html>
