Building Norms-Adaptable Agents from Potential Norms Detection Technique (PNDT)

Author(s):  
Moamin A. Mahmoud ◽  
Mohd Sharifuddin Ahmad ◽  
Azhana Ahmad ◽  
Aida Mustapha ◽  
Mohd Zaliman Mohd Yusoff ◽  
...  

This paper contributes to research on norms detection by proposing a technique called the Potential Norms Detection Technique (PNDT). The literature proposes that an agent changes or updates its norms based on the variables of the local environment and the amount of thinking about its behaviour. Consequently, any change in these two variables causes the agent to use the PNDT to update its norms so as to comply with the domain's normative protocol. This technique enables an agent to update its norms even in the absence of sanctions from a third-party enforcement authority, unlike some existing work, which requires sanctions by a third party to detect and identify the norms. The PNDT consists of five components: the agent's belief base; an observation process; the Potential Norms Mining Algorithm (PNMA), which detects the potential norms and identifies the normative protocol; a verification process, which verifies the detected potential norms; and an updating process, which updates the agent's belief base with the new normative protocol. The authors then demonstrate the operation of the algorithm by testing it on a typical scenario and analyse the results with respect to several issues.
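The five-component flow described above can be sketched minimally as follows. All names, thresholds, and data are illustrative assumptions, not the paper's actual PNMA: frequent observed behaviours become candidate norms, a second observation round verifies them, and verified norms enter the belief base.

```python
from collections import Counter

# Hypothetical sketch of the PNDT flow: belief base, observation, mining
# (a simple support threshold stands in for the PNMA), verification, update.

class NormsAgent:
    def __init__(self, support_threshold=0.5):
        self.belief_base = set()               # currently adopted norms
        self.support_threshold = support_threshold

    def observe(self, events):
        """Observation process: record behaviours seen in the environment."""
        return Counter(events)

    def mine_potential_norms(self, counts, total):
        """PNMA stand-in: behaviours above a support threshold are candidates."""
        return {b for b, n in counts.items() if n / total >= self.support_threshold}

    def verify(self, candidates, second_sample):
        """Verification process: keep candidates that recur in a fresh sample."""
        return {c for c in candidates if c in second_sample}

    def update(self, verified):
        """Updating process: fold verified norms into the belief base."""
        self.belief_base |= verified

    def detect_and_update(self, sample1, sample2):
        counts = self.observe(sample1)
        candidates = self.mine_potential_norms(counts, len(sample1))
        self.update(self.verify(candidates, set(sample2)))
        return self.belief_base

agent = NormsAgent()
norms = agent.detect_and_update(
    ["queue", "queue", "queue", "push"],   # first observation round
    ["queue", "greet"],                    # second round, for verification
)
print(norms)  # {'queue'}
```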

Author(s):  
Ayssam Elkady ◽  
Jovin Joy ◽  
Tarek Sobh

We are developing a framework (RISCWare) for the modular design and integration of sensory modules, actuation platforms, and task descriptions, implemented as a tool to reduce the effort of designing and utilizing robotic platforms. The framework customizes robotic platforms by simply defining the available sensing devices, actuation platforms, and required tasks. The main purpose of designing this framework is to reduce the time and complexity of robotic software development and maintenance costs, and to improve code and component reusability. Use of the proposed framework eliminates the need to redesign or rewrite algorithms or applications due to changes in the robot's platform or operating system, or the introduction of new functionalities. In this paper, the RISCWare framework is developed and described. RISCWare is a robotic middleware used for the integration of heterogeneous robotic components. It consists of three modules. The first is the sensory module, which represents the sensors that collect information about the remote or local environment. The platform module defines the robotic platforms and actuation methods. The last is the task-description module, which defines the tasks and applications that the platforms will perform, such as teleoperation, navigation, obstacle avoidance, manipulation, 3-D reconstruction, and map building. The plug-and-play approach is one of the key features of RISCWare: it allows auto-detection and auto-reconfiguration of the attached standardized components (hardware and software) according to the current system configuration. These components can dynamically become available or unavailable. Dynamic reconfiguration provides the ability to modify a system during its execution and can be used to apply patches and updates, to implement adaptive systems, or to support third-party modules. This automatic detection and reconfiguration of devices and driver software makes it easier and more efficient for end users to add and use new devices and software applications. In addition, the software components are written flexibly to make better use of hardware resources and are easy to install and uninstall. Several experiments, performed on the RISCbot II mobile manipulation platform, are described and implemented to evaluate the RISCWare framework with respect to applicability and resource utilization.
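The plug-and-play idea above can be illustrated with a small registry sketch. This is not RISCWare's actual API; the class and driver names are invented to show how attach/detach events can trigger auto-reconfiguration of standardized components.

```python
# Illustrative sketch (not RISCWare's real interface) of plug-and-play:
# a registry auto-configures handlers when components attach or detach.

class ComponentRegistry:
    def __init__(self):
        self.active = {}                   # component name -> driver object

    def attach(self, name, driver):
        """Auto-detection hook: a standardized component announces itself."""
        self.active[name] = driver
        driver.configure()                 # auto-reconfiguration on attach

    def detach(self, name):
        """Components can become unavailable dynamically at run time."""
        self.active.pop(name, None)

class LaserDriver:
    """Hypothetical sensory-module driver."""
    def __init__(self):
        self.ready = False
    def configure(self):
        self.ready = True
    def read(self):
        return [1.2, 1.3, 1.1]             # dummy range scan

registry = ComponentRegistry()
laser = LaserDriver()
registry.attach("laser", laser)            # dynamic reconfiguration in action
print(laser.ready, "laser" in registry.active)   # True True
registry.detach("laser")
print("laser" in registry.active)                # False
```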


Entropy ◽  
2021 ◽  
Vol 23 (10) ◽  
pp. 1294
Author(s):  
Kejia Zhang ◽  
Xu Zhao ◽  
Long Zhang ◽  
Guojing Tian ◽  
Tingting Song

Quantum dual-signature means that two signed quantum messages are combined and sent to two different recipients, and the signature requires the cooperation of two verifiers to complete the whole verification process. As an important aspect of quantum signatures, a trusted third party is introduced in current protocols, which affects their practicability. In this paper, we propose, for the first time, a quantum dual-signature protocol without an arbitrator or entanglement. In the proposed protocol, two independent verifiers are introduced; they may be dishonest but do not collude. Furthermore, strongly nonlocal orthogonal product states are used to preserve the protocol's security, i.e., no one can deny or forge a valid signature, even if some participants conspire. Compared with existing quantum signature protocols, this protocol requires neither a trusted third party nor entanglement resources.


2008 ◽  
pp. 1839-1864
Author(s):  
Elisa Bertino ◽  
Barbara Carminati ◽  
Elena Ferrari

In this chapter, we present the main security issues related to the selective dissemination of information (SDI systems). More precisely, after providing an overview of the work carried out in this field, we focus on the security properties that a secure SDI system (SSDI system) must satisfy and on some of the strategies and mechanisms that can be used to ensure them. Indeed, since XML is today's emerging standard for data exchange over the Web, we cast our attention on Secure and Selective XML data dissemination (SSXD). As a result, we present a SSXD system providing a comprehensive solution for XML documents. In this chapter, we also consider an innovative architecture for data dissemination, by suggesting a SSXD system that exploits a third-party architecture, since this architecture is receiving growing attention as a new paradigm for data dissemination over the Web. In a third-party architecture, there is a distinction between the Owner and the Publisher of information. The Owner is the producer of the information, whereas Publishers are responsible for managing (a portion of) the Owner's information and for answering user queries. A relevant issue in this architecture is how the Owner can ensure the secure dissemination of its data, even though the data are managed by a third party. Such a scenario requires a redefinition of the dissemination mechanisms developed for traditional SSXD systems; indeed, the traditional techniques cannot be exploited in a third-party scenario. For instance, consider the traditional digital signature techniques used to ensure data integrity and authenticity. In a third-party scenario, that is, a scenario where a third party may prune some of the nodes of the original document based on user queries, the traditional digital signature is not applicable, since its correctness relies on the requirement that signing and verification are performed on exactly the same bits.
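One well-known way around the pruning problem sketched above (not necessarily the chapter's exact construction) is to sign a hash computed Merkle-style over the document tree: the Owner signs only the root digest, and a Publisher that prunes a subtree reveals that subtree's bare hash, so the recipient can still recompute the same root. The tree encoding below is an illustrative assumption.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def node_hash(node):
    """node is ('hash', digest) for a pruned subtree, a str for a text leaf,
    or (tag, [children]) for an element. (A real element literally tagged
    'hash' would collide with the marker; fine for this sketch.)"""
    if isinstance(node, tuple) and node[0] == "hash":
        return node[1]                      # pruned: Publisher supplies digest
    if isinstance(node, str):
        return h(b"leaf:" + node.encode())
    tag, children = node
    acc = h(b"tag:" + tag.encode())
    for c in children:
        acc = h(acc + node_hash(c))
    return acc

# Owner's full document and its root digest (which the Owner would sign)
doc = ("order", [("public", ["item A"]), ("private", ["card no"])])
signed_root = node_hash(doc)

# Publisher prunes the <private> subtree, revealing only its hash;
# verification against the signed root still succeeds.
pruned = ("order", [("public", ["item A"]),
                    ("hash", node_hash(("private", ["card no"])))])
print(node_hash(pruned) == signed_root)   # True
```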


Cloud computing usage has increased greatly over the past decades, and the cloud offers many features to effectively store, organize, and process data. The major concern in the cloud is low security, and users require a verification process for data integrity. The Third Party Auditing (TPA) technique is applied to verify the integrity of data, and various TPA methods have been proposed for effective performance. The existing TPA methods perform poorly in terms of communication overhead and execution time. In this research, an Elliptic Curve Digital Signature (ECDS) is proposed to increase the efficiency of the TPA. A bilinear mapping technique is used for the verification process without retrieving the data, which helps reduce the communication overhead. The performance of the ECDS scheme is measured and compared with the existing methods.
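The general structure of auditing "without retrieving the data" can be illustrated with a much-simplified challenge-response sketch. This stands in for the idea only: the paper's scheme uses ECDS with bilinear maps, whereas this sketch uses keyed hashes over sampled blocks, and giving the tag key to the auditor (as here) would not be privacy-preserving in a real system.

```python
import hashlib
import hmac
import random

# Simplified stand-in for third-party auditing: the server proves possession
# of randomly challenged blocks; the auditor never downloads the whole file.

def block_tag(key: bytes, index: int, block: bytes) -> bytes:
    """Client precomputes a keyed tag per block before outsourcing."""
    return hmac.new(key, index.to_bytes(4, "big") + block, hashlib.sha256).digest()

# Client side: split the file, tag the blocks, give data to the cloud
# and tags to the auditor (an illustrative simplification).
key = b"client-secret"
blocks = [b"block-0", b"block-1", b"block-2", b"block-3"]
tags = {i: block_tag(key, i, b) for i, b in enumerate(blocks)}

def server_prove(challenge):
    """Cloud answers a challenge by returning the challenged blocks."""
    return {i: blocks[i] for i in challenge}

def tpa_verify(challenge, proof):
    """Auditor checks each returned block against its stored tag."""
    return all(hmac.compare_digest(block_tag(key, i, proof[i]), tags[i])
               for i in challenge)

challenge = random.sample(range(len(blocks)), 2)
print(tpa_verify(challenge, server_prove(challenge)))   # True
```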


2020 ◽  
Author(s):  
S. S. Jaya ◽  
K. T. Subhadra

Cloud computing is a growing technology that offers compute, storage and network resources as a service over the Internet. It enables individuals, clients and enterprises to outsource their data and application software to a cloud server. The services are offered by a cloud service provider (CSP), and users pay for what they use. Many security concerns need to be addressed when the data is maintained by a third-party service provider in the cloud. An auditor is introduced to check the integrity of the data on behalf of the client; this is called public auditability of data. Recently, two privacy-preserving auditing mechanisms, named Oruta and Knox, were introduced to check the correctness of stored data. In this paper, we expose a security flaw in these schemes when active adversaries are involved in cloud storage. An active adversary is capable of arbitrarily modifying the data stored in the cloud, and this modification is not detected by the user or the auditor during the verification process. We suggest a solution that resolves this flaw by signing the proof response generated on the cloud server side. The signed proof is then sent to the trusted third-party auditor (TTPA) for verification: the auditor first verifies the signature and then validates the proof. The proposed scheme is proved to be secure against an active adversary.
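The suggested fix, signing the proof response before it reaches the auditor, can be sketched as below. An HMAC stands in for the server's real signature scheme, and all names are illustrative: the point is only that the TTPA checks the signature first, so a tampered proof is rejected before proof validation.

```python
import hashlib
import hmac

# Illustrative sketch of the fix: the cloud server signs its proof response,
# so an active adversary who alters the proof cannot pass the TTPA's check.
# A keyed HMAC stands in for a real server-side signature scheme.

server_key = b"server-signing-key"      # verifiable by the TTPA in this sketch

def sign_proof(proof: bytes) -> bytes:
    return hmac.new(server_key, proof, hashlib.sha256).digest()

def ttpa_verify(proof: bytes, signature: bytes) -> bool:
    """Step 1: verify the signature; only then would the TTPA validate the
    proof contents themselves (omitted here)."""
    return hmac.compare_digest(sign_proof(proof), signature)

proof = b"aggregated-proof-over-challenged-blocks"
sig = sign_proof(proof)
print(ttpa_verify(proof, sig))                 # True
print(ttpa_verify(b"tampered-proof", sig))     # False: modification detected
```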


Author(s):  
Ika Oktavia Suzanti ◽  
Reza Pulungan

A Mobile Ad-hoc Network (MANET) is a group of wireless mobile nodes connected to one another without a fixed infrastructure, so the topology can change at any time. MANET routing protocols follow two models: reactive routing protocols, which build a routing table only when needed, and proactive routing protocols, which maintain the routing table periodically. The general properties that an ad-hoc network protocol must satisfy are route discovery, packet delivery and loop freedom. AODV is a reactive MANET protocol with a time standard that determines how long a route may be used (route validity), so the route discovery and packet delivery properties must be satisfied within that time. Protocol verification is performed by modelling the protocol specification using mathematical techniques, tools and languages. In this research, the protocols are modelled using timed automata, a modelling language for systems in which each process depends on time. Verification with timed automata is done automatically using the UPPAAL model checker. The protocols verified are the AODV Break Avoidance protocol of Ali Khosrozadeh et al. and the AODV Reliable Delivery protocol of Liu-Jian and Fang-Min. The verification results prove that AODV Break Avoidance satisfies the route discovery property and AODV Reliable Delivery satisfies the packet delivery property within their specified times. Keywords: Protocol Verification, Timed Automata, AODV, UPPAAL
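The route-validity notion at the heart of the verified properties can be sketched very simply: an AODV route entry carries a lifetime, and properties such as packet delivery must hold before the entry expires. The class below is illustrative; only the 3-second active-route timeout comes from AODV's specification (RFC 3561).

```python
# Minimal sketch of AODV route validity: a discovered route may only be
# used within its lifetime; afterwards, route discovery must run again.

ACTIVE_ROUTE_TIMEOUT = 3.0   # seconds; AODV's default is 3000 ms (RFC 3561)

class RouteEntry:
    def __init__(self, dest, next_hop, created_at):
        self.dest, self.next_hop = dest, next_hop
        self.expires_at = created_at + ACTIVE_ROUTE_TIMEOUT

    def valid(self, now):
        """Route discovery / packet delivery must complete while this holds."""
        return now < self.expires_at

route = RouteEntry("nodeD", "nodeB", created_at=0.0)
print(route.valid(now=1.5))   # True: within the route-validity window
print(route.valid(now=3.5))   # False: expired, rediscovery needed
```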




2018 ◽  
Vol 10 (3) ◽  
pp. 297-312 ◽  
Author(s):  
Louise Manning

Purpose
The purpose of this paper is to critique the existing and emerging alternative approaches being used by regulators and industry to verify the presence and efficacy of food safety management systems (FSMS). It is the second paper in a theme issue of Worldwide Hospitality and Tourism Themes, discussing the importance of measuring food safety and quality culture.
Design/methodology/approach
This paper, primarily focused on UK examples, examines academic and grey literature to consider the options for effective verification of FSMS, with emphasis on the hospitality sector, including the use of triangulation.
Findings
Third-party certification (TPC) compliance audits alone will not deliver effective verification of the FSMS and the cultural context of how formal systems are implemented, monitored and internally verified. Triangulation needs to be undertaken during the FSMS verification process, which at its simplest is a Question, Observe, Measure (QOM) triad and at its most complex involves TPC compliance audits and performance assessment using data analysis methodology and product and environmental testing.
Originality/value
The paper will be of value to practitioners, researchers and other stakeholders involved in the hospitality industry.


Author(s):  
S.V. Palmov ◽  
A.A. Diyazitdinova ◽  
E.S. Artyushkina

Two programs developed by the authors of the article, based on artificial-intelligence methods, are presented; they solve the problem of discovering hidden patterns in statistical data: Augur and iWizard-E. The first is based on an association-rule mining algorithm, and the second on a modified CART decision tree. To increase the reliability of the comparative-analysis results, four third-party intelligent systems (Deductor, Orange, KNIME and WizWhy) were used in the study, as well as two sets of statistical data, each of which contains sixteen patterns. A series of seven experiments showed a significant superiority of iWizard-E over Augur, which is due to iWizard-E's more advanced algorithm.
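The class of method named above, association-rule mining, can be illustrated with a short support/confidence sketch. This is not Augur's actual algorithm; the data and thresholds are invented, and only pairwise rules are considered.

```python
from collections import Counter
from itertools import combinations

# Illustrative pairwise association-rule miner with support and confidence
# thresholds (Apriori-style). Not Augur's real algorithm.

def mine_rules(transactions, min_support=0.5, min_confidence=0.6):
    n = len(transactions)
    counts = Counter()
    for t in transactions:
        for size in (1, 2):
            for itemset in combinations(sorted(set(t)), size):
                counts[itemset] += 1
    rules = []
    for pair, cnt in counts.items():
        if len(pair) != 2 or cnt / n < min_support:
            continue                        # pair not frequent enough
        for a, b in (pair, pair[::-1]):
            # rule a -> b holds if supp({a,b}) / supp({a}) meets the threshold
            if cnt / counts[(a,)] >= min_confidence:
                rules.append((a, b))
    return rules

data = [["bread", "milk"], ["bread", "milk"], ["bread"], ["milk", "eggs"]]
print(mine_rules(data))   # [('bread', 'milk'), ('milk', 'bread')]
```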


Security is the key consideration in vehicular ad-hoc networks (VANETs), which are prone to various security threats. A VANET carries life-critical information and must be secured against harmful external agents. This paper presents a third-party-based security approach that secures the VANET environment through a verification process, in which signatures are generated, distributed to nodes, and checked at the time of any transmission. In the suggested approach, the rise in mobility decreases the packet delivery ratio, yet the performance of the proposed protocol is approximately 4% better than that of other techniques. Moreover, the rise in mobility increases the average delay; when the proposed protocol is compared with group-based authentication, the improvement in its performance is approximately 50%. Thus, the proposed approach is fully focused on security and consequently secures the system.

