
It is an interesting option to extend this study from different perspectives: green information systems, AI-based computation of maximum power point tracking (MPPT), and the construction of quantitative models for self-healing mechanisms and real-time fault diagnostics. A solar computing system must ensure real-time secure monitoring and sense-and-respond modeling; it should tune itself adaptively to an optimal state, anticipate the various types of threats that could disturb its stability, and isolate the healthy part of the system from the faulty one. A smart power grid is vulnerable to natural disasters, intentional physical and cyber attacks, and threats of deception. The size and complexity of the grid structure and the associated costs of erection, testing, commissioning and maintenance are the major constraints on protecting the entire infrastructure physically. Acts of terrorism that disrupt the smart grid also threaten national security, the economy and the quality of life of ordinary people. Energy security demands fundamental rethinking and radical redesign of existing power grids globally.
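As one concrete starting point for the MPPT direction mentioned above, the classical perturb-and-observe algorithm, which AI-based MPPT methods typically aim to improve upon, can be sketched as follows. The PV power curve used here is a hypothetical stand-in, not a real panel model.

```python
def perturb_and_observe(pv_power, v0=30.0, dv=0.5, steps=60):
    """Classic perturb-and-observe MPPT: nudge the operating voltage and
    keep moving in the direction that increased the measured power."""
    v, p_prev, step = v0, pv_power(v0), dv
    for _ in range(steps):
        v += step
        p = pv_power(v)
        if p < p_prev:      # power dropped: reverse the perturbation direction
            step = -step
        p_prev = p
    return v

# Toy PV power curve with its maximum power point at 35 V (hypothetical panel).
mpp = perturb_and_observe(lambda v: -(v - 35.0) ** 2 + 200.0)
```

The tracker oscillates in a small band around the maximum power point; AI-based variants replace the fixed perturbation step with an adaptive or learned policy.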
Intelligent business and technical analysis of IIoT, ICS and SCADA requires critical up-to-date data, analytical models and tools for rational, fair and correct evaluation of technology innovation. This study can be extended in various ways, for instance to emerging applications such as banking and financial services, defense and e-governance. ICS and SCADA networks are potentially vulnerable to intrusion and various types of malicious cyber attacks, which may seriously affect the safety of the public and the performance of critical infrastructure and may cause huge financial loss; such systems are expected to be resilient. This work identifies a set of interesting research questions for future work: how can we develop intelligent threat analytics and secure verification algorithms, from the perspectives of method, target, identity and protocol, to ensure the confidentiality, authentication, integrity, availability, accountability and access control of ICS and SCADA infrastructure? How can the various parameters of security intelligence be quantified? Is it possible to develop automated verification and model-checking algorithms for ICS and SCADA? We need a new broad outlook, imagination and dreams to solve a complex problem through a set of simple mechanisms.
The expert panel highlighted some common SMC tools; an interesting open question is how to construct these tools in the setting of quantum computing.
Oblivious transfer: Oblivious transfer (OT) is a two-party protocol wherein a receiver learns some information regarding the input of the sender in such a way that the sender does not know what the receiver has learnt. The notion of oblivious transfer has several versions. Rabin introduced the concept; in the original OT problem, the receiver learns the secret of the sender with probability ½. 1-out-of-2 oblivious transfer (OT12) is a protocol by which a sender (S) obliviously transfers to a receiver (R) one of two secret messages (Even, Goldreich and Lempel, 1985). 1-out-of-n oblivious transfer is a two-party protocol in which Alice holds n inputs and Bob holds an index known only to him; at the end of the protocol, Bob learns exactly one of the n inputs and Alice does not learn which one (Naor and Pinkas, 1999). k-out-of-n oblivious transfer is an extended version of 1-out-of-n oblivious transfer where 1 ≤ k ≤ n−1. Distributed oblivious transfer distributes the task of Alice among several servers, each holding partial information about the secret (Naor and Pinkas, 2000). The receiver has to contact t (t ≥ 1) or more servers to obtain the secret; otherwise it cannot obtain any information about it. Security is ensured as long as only a limited number of servers collude. Oblivious polynomial evaluation is a two-party protocol in which Alice holds a polynomial P and Bob holds an input x; at the end of the protocol, Bob learns P(x) but Alice learns nothing about x. Naor and Pinkas (1999) studied oblivious polynomial evaluation based on the oblivious transfer protocol. Oblivious transfer is a fundamental primitive for cryptography and secure distributed computation and has many applications.
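The 1-out-of-2 flavour can be made concrete. Below is a minimal sketch in the style of the Even-Goldreich-Lempel RSA-based construction: the receiver blinds one of two sender nonces, the sender unblinds both candidates without knowing which one is real, and the receiver can strip the mask from only the chosen message. The primes and message encoding are toy assumptions for illustration only and offer no real security.

```python
import random

def ot_1_of_2(m0, m1, b):
    """Toy 1-out-of-2 oblivious transfer (Even-Goldreich-Lempel style,
    RSA-based). Deliberately small parameters; illustration only."""
    # --- Sender: toy RSA key and two random nonces x0, x1 ---
    p, q = 1000003, 1000033            # far too small for real use
    n = p * q
    phi = (p - 1) * (q - 1)
    e = 65537
    d = pow(e, -1, phi)
    x0, x1 = random.randrange(n), random.randrange(n)
    # --- Receiver: blind the chosen nonce with a random mask k ---
    k = random.randrange(n)
    xb = x0 if b == 0 else x1
    v = (xb + pow(k, e, n)) % n
    # --- Sender: derive both candidate masks; cannot tell which equals k ---
    k0 = pow((v - x0) % n, d, n)
    k1 = pow((v - x1) % n, d, n)
    c0 = (m0 + k0) % n
    c1 = (m1 + k1) % n
    # --- Receiver: only the chosen ciphertext unmasks correctly ---
    cb = c0 if b == 0 else c1
    return (cb - k) % n
```

Only kb equals k (since (v − xb)^d = (k^e)^d = k mod n), so the other message stays masked by a value the receiver cannot compute.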
Secure function evaluation: Alice with an input x and Bob with an input y want to evaluate a function z = f(x, y) on their joint inputs in such a way that neither party gains more information than is implied by its own input and the function value. Alice and Bob can achieve this through a protocol known as secure function evaluation. In the field of secure function evaluation, f is represented in various ways: garbled circuit construction, combinatorial circuit, algebraic representation as a product of matrices over a large field, low-degree polynomial, and randomizing polynomials.
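As a minimal illustration of the idea, the simplest function f(x, y) = x + y can be evaluated securely with additive secret sharing: each input is split into random shares that sum to the secret, each share-holder adds its shares locally, and only the final sum is reconstructed. The modulus and the three-party setting below are illustrative assumptions, not part of any specific protocol in the text.

```python
import random

MOD = 2**32  # arbitrary toy modulus for the share arithmetic

def share(x, n_parties=3):
    """Split secret x into n additive shares that sum to x mod MOD."""
    shares = [random.randrange(MOD) for _ in range(n_parties - 1)]
    shares.append((x - sum(shares)) % MOD)
    return shares

def secure_sum(x, y):
    """Each party holds one share of x and one of y; summing all local
    share-sums reconstructs x + y without any party seeing x or y."""
    xs, ys = share(x), share(y)
    local_sums = [(a + b) % MOD for a, b in zip(xs, ys)]  # done per party
    return sum(local_sums) % MOD
```

Each individual share is uniformly random, so no single party learns anything about x or y; richer functions need circuit-based techniques such as the garbled circuits named above.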
Mixnet: A mixnet consists of multiple independent mix-servers and enables a group of senders to send their messages anonymously. Chaum (1988) introduced the concept of the mixnet, and several researchers later proposed mixnet protocols for shuffling encrypted messages. The concept has been widely applied to the design of efficient electronic voting schemes: each voter sends an encrypted ballot to the mixnet, which shuffles the posted ballots so that the voter-vote relationship is lost. After the mixing process, multiple tally servers jointly decrypt the encrypted ballots and count the votes. Mixnets can be classified into verifiable mixnets and optimistic mixnets on the basis of their correctness proofs. Each server of a verifiable mixnet provides a proof that its shuffling is correct. An optimistic mixnet, on the other hand, does not verify the correct shuffling of each server; the correctness of the shuffling of the whole mixnet is verified after it generates its output.
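The decrypt-and-shuffle pipeline can be sketched as follows. Real mixnets use public-key encryption or re-encryption (e.g., ElGamal); the XOR "layers" below are a deliberately insecure stand-in chosen only to show the structure: senders wrap each ballot in one layer per server, and each server strips its own layer and shuffles the batch, destroying the sender-message link.

```python
import hashlib
import random

MSG_LEN = 16  # fixed toy ballot length

def xor_layer(key: bytes, msg: bytes) -> bytes:
    """Toy layer 'encryption': XOR with a key-derived pad (involutive, NOT secure)."""
    pad = hashlib.sha256(key).digest()[:MSG_LEN]
    return bytes(a ^ b for a, b in zip(msg, pad))

def mixnet_demo(ballots, server_keys):
    # Senders: wrap each ballot in one layer per server.
    batch = []
    for b in ballots:
        c = b.ljust(MSG_LEN, b"\x00")
        for k in reversed(server_keys):
            c = xor_layer(k, c)
        batch.append(c)
    # Servers: each strips its layer and shuffles the whole batch.
    for k in server_keys:
        batch = [xor_layer(k, c) for c in batch]
        random.shuffle(batch)
    return [c.rstrip(b"\x00") for c in batch]

out = mixnet_demo([b"alice:yes", b"bob:no"], [b"srv1", b"srv2", b"srv3"])
```

The output multiset equals the input multiset, but the output order is independent of the submission order, which is exactly the property the voting application relies on.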
Private comparison: Yao's millionaire problem is to determine who is richer between two parties such that no information about either party's value is revealed to the other. Yao first proposed a protocol for this problem without using any untrusted third party; its cost was exponential in both time and space. Later, scrambled circuits were used to solve this problem at linear cost of computation and communication. Cachin (1999) suggested an efficient solution using an oblivious third party. Fischlin (2001) used the Goldwasser-Micali (GM) encryption scheme to construct a two-round non-interactive crypto-computing GT (greater-than) protocol. Two inputs can be compared by examining the most significant bit in which they differ: equal bits do not affect the result, and the effect of unequal low-order bits is overshadowed by the high-order bits. Based on this principle, Ioannidis and Grama (2003) proposed a private comparison protocol using an oblivious transfer scheme. Schoenmakers and Tuyls (2004) used a threshold homomorphic encryption scheme to solve the private comparison problem. Blake and Kolesnikov (2004) used the concepts of Q-conditional oblivious transfer and the additively homomorphic Paillier cryptosystem to construct a two-round private comparison protocol; its computation cost is O(n log N), where n is the length of the inputs and N is the size of the plaintext domain of the Paillier scheme.
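Stripped of all privacy machinery, the bit-comparison principle these protocols build on amounts to the following scan; in the actual protocols each bit test is carried out obliviously (under oblivious transfer or homomorphic encryption) rather than in the clear.

```python
def gt(a: int, b: int, bits: int = 32) -> bool:
    """Compare a and b by scanning bits from most significant to least:
    the first position where they differ decides which number is larger,
    and all lower-order bits become irrelevant."""
    for i in reversed(range(bits)):
        a_bit, b_bit = (a >> i) & 1, (b >> i) & 1
        if a_bit != b_bit:
            return a_bit > b_bit
    return False  # equal inputs: a is not greater than b
```

Because the outcome depends only on the single most significant differing position, a private protocol needs to hide which position that is, not just the bit values themselves.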
In this session, the panel analyzed the complexity of the emerging technology of secure multi-party quantum computing (SMQC) by reviewing various related works [1-15]: what are the fundamental concepts of quantum computation and quantum information? What are the scopes of application of SMQC, and how can these concepts be developed? How can quantum error-correcting codes and fault-tolerant quantum computation be developed? How can cryptographic protocols and private- and public-key cryptosystems be developed in the setting of SMQC? What is the future of SMQC? What can quantum computation and quantum information offer to science, technology and humanity? What are the benefits of SMQC for computer science, information theory and physics? What are the key open problems of quantum computation and quantum information? SMQC is a basic building block for thinking physically about computation, and it yields many new and exciting capabilities for information processing and communication. Secure multi-party quantum computation poses hard challenges in terms of computation and communication complexity.
This session shows the application of cryptographically secure protocols for kernelized support vector machines in a secure adaptive filter. However, there are still many open problems in private SVM classification, private learning and secure adaptive filters. An interesting question is how to compute, and securely hide, the convergence speed of private SVM algorithms. Are there iterative private linear classification methods that need no circuit evaluation? Another relevant issue is the private computation of encrypted kernel matrices for structured data. A protocol preserves privacy if no agent learns anything more than its output; the only information that should be disclosed about other agents' inputs is what can be derived from the output itself. Secure multi-party computation preserves the privacy of data in different ways. Is it possible to apply SMC protocols to a secure adaptive filter? What are the algorithms? Is the cost of computation and communication high in secure multi-party computation of an adaptive filter? Privacy is one of the primary concerns of a secure adaptive filter; is it possible to address this issue using the concepts of cryptography and secure multi-party computation? The fundamental objectives of cryptography are to provide confidentiality, data integrity, authentication and non-repudiation; cryptography ensures the privacy and secrecy of information through encryption and decryption. The challenge is how to apply existing encryption and decryption algorithms in a secure adaptive filter: is it possible to perform the quantitative operations of a secure adaptive filter on encrypted data, and how? Is it possible to apply digital signatures to a secure adaptive filter, and to perform its quantitative operations on digitally signed data? Is it possible to apply signcryption to a secure adaptive filter: can private search be performed on signcrypted data, and how?
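One partial answer to the question of quantitative operations on encrypted data is additively homomorphic encryption: under the Paillier scheme mentioned in the private-comparison discussion, multiplying ciphertexts adds the underlying plaintexts, so sums such as filter accumulations could in principle be computed without decryption. The sketch below uses toy primes and is for illustration only, not a secure implementation.

```python
import random
from math import gcd

def paillier_keygen():
    """Toy Paillier key generation (tiny primes; illustration only)."""
    p, q = 293, 433
    n = p * q
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
    g = n + 1
    mu = pow((pow(g, lam, n * n) - 1) // n, -1, n)  # L(g^lam mod n^2)^-1 mod n
    return (n, g), (lam, mu)

def enc(pk, m):
    n, g = pk
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def dec(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    return ((pow(c, lam, n * n) - 1) // n * mu) % n

pk, sk = paillier_keygen()
n = pk[0]
# Homomorphic property: ciphertext product decrypts to the plaintext sum.
c_sum = (enc(pk, 20) * enc(pk, 22)) % (n * n)
```

This supports additions (and scalar multiplications via exponentiation) on encrypted values; the multiplications and nonlinearities of a full adaptive filter would still require SMC protocols or circuit evaluation, which is exactly the open question posed above.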
The central message of this summit is that the success of technology innovation projects depends on several factors: strengths, weaknesses, opportunities, threats, the technology life-cycle, understanding the needs of consumers, the competitive environment, blind spots, and the ability to recognize and align the partners associated with the value chain and innovation ecosystem. Deep analytics is essential to coordinate, integrate and synchronize the '7-S' elements: scope, system, structure, staff-resources, skill-style-support, security and strategy. Even the most brilliant innovation cannot succeed when its value creation depends on the innovation of other technologies. This book evaluates a set of emerging technologies for humanity; most of these innovations are at the emergence stage, while others are at the growth stage. Hopefully, deep analytics will be able to accelerate the pace of these emerging technological innovations. Let us discuss the limitations and future scope of this summit. The aforesaid content summarizes the discussions held during the various sessions of the summit and is focused on emerging technologies associated with electrical and electronics engineering, information and communication technologies, and computer science. There are various other branches of science and technology, such as mathematics, physics, chemistry, biology, earth science, geology, and mechanical, chemical, petrochemical, oil and gas, pharmaceutical, biotechnology, genetics, metallurgical, civil, structural, construction, production and power plant engineering. It is essential to have depth and breadth of knowledge to explore emerging technologies in those branches of engineering.
Secondly, deep analytics demands the support of quality technical and business data and efficient quantitative tools and techniques to evaluate the potential of a technology and to analyze its life-cycle, diffusion, adoption, infusion and innovation, dominant design, blind spots and spillover effects. It is also essential to evaluate these emerging technologies deeply through numerical, statistical, quantitative and qualitative analysis based on up-to-date data. In fact, there is no end to this intelligent deep analysis, and no end to the debate on the strengths, weaknesses, threats and opportunities of emerging technologies in comparison with existing technologies. Let us try to save the world…