isidore.science
https://isidore.science
Your search assistant in Humanities and Social Sciences
On two "steganographic tables". A new contribution to the history of cryptography in Mexico during the nineteenth century
http://www.scielo.org.mx/scielo.php?script=sci_arttext&pid=S2448-65312022000101361
https://isidore.science/document/10670/1.ymtxn9
2022-03-01
Cyber Security Politics: Socio-Technological Transformations and Political Fragmentation (Edition 1)
This book examines new and challenging political aspects of cyber security, presenting it as an issue defined by socio-technological uncertainty and political fragmentation. Structured along two broad themes and providing empirical examples of how socio-technical changes and political responses interact, the first part of the book looks at the current use of cyberspace in conflictual settings, while the second focuses on political responses by state and non-state actors in an environment defined by uncertainties. Within this, it highlights four key debates that encapsulate the complexities and paradoxes of cyber security politics from a Western perspective: how much political influence states can achieve via cyber operations and what contextual factors condition the (limited) strategic utility of such operations; the role of emerging digital technologies and how the dynamics of the tech innovation process reinforce the fragmentation of the governance space; how states attempt to uphold stability in cyberspace and, more generally, in their strategic relations; and how the shared responsibility of state, economy, and society for cyber security continues to be renegotiated in an increasingly trans-sectoral and transnational governance space. This book will be of much interest to students of cyber security, global governance, technology studies, and international relations.
https://openresearchlibrary.org/viewer/d6bc547f-5613-447f-b899-7798c2d81685
https://isidore.science/document/10670/1.n28g9r
2022-02-16
Cryptocurrency Valuation: An Explainable AI Approach
Currently, there are no convincing proxies for the fundamentals of cryptocurrency assets. We propose a new market-to-fundamental ratio, the price-to-utility (PU) ratio, utilizing unique blockchain accounting methods. We then proxy various fundamental-to-market ratios using Bitcoin historical data and find that they have little predictive power for short-term bitcoin returns. However, the PU ratio effectively predicts long-term bitcoin returns. We verify the PU ratio valuation by unsupervised and supervised machine learning. The valuation method informs investment returns and predicts bull markets effectively. Finally, we present an automated trading strategy advised by the PU ratio that outperforms conventional buy-and-hold and market-timing strategies. We distribute the trading algorithms as open-source software via the Python Package Index for future research.
http://arxiv.org/abs/2201.12893
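The trading idea in the abstract — hold the asset only when a fundamental-to-market signal says it is cheap, step aside otherwise — can be sketched generically. The PU ratio itself requires blockchain accounting data that the paper derives, so the synthetic price series and the flat `fair_value` below are stand-ins, purely to show the shape of such a backtest:

```python
import numpy as np

# Synthetic prices and a made-up "fundamental" stand in for the paper's
# PU-ratio inputs; only the backtest mechanics are illustrated here.
rng = np.random.default_rng(0)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.02, 500)))  # toy price path
fair_value = 100.0                                          # stand-in fundamental
ratio = prices / fair_value                                 # market-to-fundamental

returns = np.diff(prices) / prices[:-1]
in_market = ratio[:-1] < 1.0                 # hold only when "undervalued"
strategy = np.where(in_market, returns, 0.0)  # flat (0 return) otherwise

buy_and_hold = np.prod(1 + returns) - 1       # benchmark: always invested
timed = np.prod(1 + strategy) - 1             # ratio-driven market timing
```

Whether the timed strategy actually outperforms depends entirely on the quality of the fundamental proxy, which is the paper's contribution; this sketch only shows where such a signal plugs in.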
https://isidore.science/document/10670/1.6aqqfb
2022-01-30
The Privacy-Welfare Trade-off: Effects of Differential Privacy on Influence & Welfare in Social Choice
http://arxiv.org/abs/2201.10115
https://isidore.science/document/10670/1.xco4g8
2022-01-25
AI-based Re-identification of Behavioral Clickstream Data
http://arxiv.org/abs/2201.10351
https://isidore.science/document/10670/1.gxbni7
2022-01-21
Empirical Analysis of EIP-1559: Transaction Fees, Waiting Time, and Consensus Security
The transaction fee mechanism (TFM) is an essential component of a blockchain protocol. However, a systematic evaluation of the real-world impact of TFMs is still absent. Using rich data from the Ethereum blockchain, mempool, and exchanges, we study the effect of EIP-1559, one of the first deployed TFMs to depart from the traditional first-price auction paradigm. We conduct a rigorous and comprehensive empirical study to examine its causal effect on blockchain transaction fee dynamics, transaction waiting times, and security. Our results show that EIP-1559 improves the user experience by making fee estimation easier, mitigating the intra-block difference in gas prices paid, and reducing users' waiting times. However, EIP-1559 has only a small effect on gas fee levels and consensus security. In addition, we find that when Ether's price is more volatile, waiting times are significantly higher. We also verify that a larger block size increases the presence of siblings. These findings suggest new directions for improving TFMs.
http://arxiv.org/abs/2201.05574
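EIP-1559's core mechanism, whose empirical effects the paper measures, is a deterministic base-fee update: the fee moves by at most 1/8 per block, in proportion to how far gas usage deviated from the target. A minimal sketch of that rule, following the public EIP-1559 specification rather than any code from the paper:

```python
BASE_FEE_MAX_CHANGE_DENOMINATOR = 8  # per the EIP-1559 specification

def next_base_fee(base_fee: int, gas_used: int, gas_target: int) -> int:
    """Next block's base fee (in wei) under EIP-1559's update rule."""
    if gas_used == gas_target:
        return base_fee
    delta = gas_used - gas_target
    change = base_fee * abs(delta) // gas_target // BASE_FEE_MAX_CHANGE_DENOMINATOR
    if delta > 0:
        return base_fee + max(change, 1)   # congested block: fee rises
    return base_fee - change               # under-full block: fee falls

# A completely full block (2x the target) raises the fee by 12.5%,
# and an empty block lowers it by 12.5% - hence "easier fee estimation".
assert next_base_fee(1_000, 30_000_000, 15_000_000) == 1_125
assert next_base_fee(1_000, 0, 15_000_000) == 875
```

Because the base fee is burned and computed from on-chain state alone, users can predict it exactly one block ahead, which is the mechanism behind the improved fee estimation the study reports.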
https://isidore.science/document/10670/1.8evlb1
2022-01-14
Safe Equilibrium
The standard game-theoretic solution concept, Nash equilibrium, assumes that all players behave rationally. If we follow a Nash equilibrium and opponents are irrational (or follow strategies from a different Nash equilibrium), then we may obtain an extremely low payoff. On the other hand, a maximin strategy assumes that all opposing agents are playing to minimize our payoff (even if it is not in their best interest), and ensures the maximal possible worst-case payoff, but results in exceedingly conservative play. We propose a new solution concept called safe equilibrium that models opponents as behaving rationally with a specified probability and behaving potentially arbitrarily with the remaining probability. We prove that a safe equilibrium exists in all strategic-form games (for all possible values of the rationality parameters), and prove that its computation is PPAD-hard. We present exact algorithms for computing a safe equilibrium in both 2-player and $n$-player games, as well as scalable approximation algorithms.
http://arxiv.org/abs/2201.04266
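The solution concept can be illustrated numerically. In the toy two-player game below (payoff matrices invented for illustration; this grid search is not the authors' exact algorithm), the opponent best-responds with probability p and plays adversarially with probability 1 - p; as p grows, the optimal "safe" strategy shifts from the conservative maximin row to the exploitative one:

```python
import numpy as np

U1 = np.array([[3.0, 0.0],   # our payoff: rows = our actions, cols = opponent's
               [1.0, 1.0]])  # row 2 is the maximin-safe row (worst case 1)
U2 = np.array([[2.0, 1.0],   # opponent's payoff: their first column dominates
               [2.0, 1.0]])

def safe_value(x: float, p: float) -> float:
    """Expected payoff of mixing our first row with probability x, when the
    opponent best-responds with probability p and is adversarial otherwise."""
    ours = np.array([x, 1.0 - x]) @ U1     # our payoff for each opponent column
    theirs = np.array([x, 1.0 - x]) @ U2   # opponent's payoff for each column
    rational = ours[np.argmax(theirs)]     # opponent maximizes own payoff
    worst = ours.min()                     # opponent minimizes our payoff
    return p * rational + (1.0 - p) * worst

def safe_strategy(p: float, grid: int = 1000) -> float:
    """Grid search for the probability of playing row 1 that maximizes safe value."""
    xs = np.linspace(0.0, 1.0, grid + 1)
    return max(xs, key=lambda x: safe_value(x, p))
```

With p = 0 this reduces to maximin play (`safe_strategy(0.0)` stays on the safe row), while with high p it exploits the opponent's dominant column (`safe_strategy(0.9)` commits to the first row).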
https://isidore.science/document/10670/1.1ddn21
2022-01-11
Supersingular Group Actions and Post-quantum Key-exchange
Public-key cryptography, discovered 50 years ago by Whitfield Diffie and Martin Hellman, uses pairs of keys (one private, one public) to build secure protocols. Public-key cryptography has become an essential part of everyday secure systems, particularly for digital signatures and key-exchange protocols. However, current public-key cryptographic techniques are threatened by the development of quantum computers, which can efficiently attack the number-theoretic problems that guarantee the security of the most common public-key systems today. In anticipation of that threat, post-quantum public-key algorithms, which run on classical computers but resist classical and quantum adversaries, are being designed. One family of post-quantum cryptosystems relies on isogenies, i.e. homomorphisms between elliptic curves. In particular, two isogeny-based key-exchange protocols are currently under investigation: SIDH (Supersingular Isogeny Diffie-Hellman) and CSIDH (Commutative Supersingular Isogeny Diffie-Hellman). In this thesis, we focus on CSIDH and its underlying ideal class group action. This thesis contains two main contributions. First, a constant-time implementation of CSIDH, meaning that it has built-in countermeasures against timing, power-consumption analysis, and fault-injection attacks. Second, a generalization of the CSIDH key-exchange protocol, using sets of curves with isogenies to their conjugates, and proving the existence of a free and transitive group action. We also study the underlying isogeny graph structures, as well as key compression and key validation techniques for this generalized key-exchange protocol. Finally, we use this generalization to study its cryptanalytic application to SIDH.
http://www.theses.fr/2021IPPAX120/document
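The "free and transitive group action" at the heart of CSIDH supports a Diffie-Hellman-style exchange because the two secret actions commute. A toy analogue using exponentiation modulo a Mersenne prime shares that commuting structure (but none of the post-quantum security of isogeny class-group actions):

```python
# Toy stand-in for the CSIDH group action: [a] acts on a public element
# by exponentiation. Real CSIDH acts by ideal classes on supersingular
# elliptic curves; only the algebraic shape of the exchange is shown.

P = 2**127 - 1   # a Mersenne prime (M127)
G = 3            # public starting element, standing in for the curve E0

def act(secret: int, point: int) -> int:
    """Apply the 'group action' [secret] to a public element."""
    return pow(point, secret, P)

a, b = 123456789, 987654321           # Alice's and Bob's secret keys
pub_a, pub_b = act(a, G), act(b, G)   # exchanged in the clear
shared_a = act(a, pub_b)              # Alice computes [a][b]E0
shared_b = act(b, pub_a)              # Bob computes   [b][a]E0
assert shared_a == shared_b           # the actions commute
```

The thesis's constant-time concern arises because a naive evaluation of the real action takes secret-dependent time, unlike the fixed-window modular exponentiation used here.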
https://isidore.science/document/10670/1.5jhz2d
2021-12-17
Cryptocurrencies: Are they really that lucrative?
Research background: Cryptocurrency is a digital currency intended for online trading. It implements the principles of cryptography to create a distributed, decentralized, and secure digital currency. Virtual money is a new and promising branch of the virtual economy that brings many advantages and disadvantages in a global sense. Many people have joined the cryptocurrency hype because investment in this digital money surged during the pandemic, and the rise in returns from it has attracted attention worldwide. Purpose of the article: The goal is to analyze cryptocurrency trading during the global Covid-19 pandemic, as well as investment in these alternative assets, which gain more attention every day precisely because of their freedom and detachment. Methods: The article analyzes data compared across the years before and during the Covid-19 pandemic. Based on these data, the investment activity of individuals, companies, and corporations is compared. Findings & value added: The results show that during the Covid-19 pandemic, interest in investing in cryptocurrencies increased compared to the period before the pandemic. The overall conclusion is that people are moving to a new way of holding money: banks are unable to provide returns on client deposits comparable to cryptocurrencies, although the latter are associated with much greater risk.
https://www.shs-conferences.org/10.1051/shsconf/202112903002/pdf
https://isidore.science/document/10670/1.uht9bo
2021-12-16
Study of the resistance of symmetric-key algorithms against modern cryptanalysis
The goal of this thesis is to contribute to the state of the art by proposing new areas of research for securing cryptographic algorithms within embedded devices. Our main focus is the countermeasure known as threshold implementations, which is resistant against side-channel analysis attacks in the presence of glitches. Glitches occur randomly within an electronic circuit and have enabled numerous cryptanalytic attacks. We study the application of threshold implementations to symmetric-key cryptography. In a first phase, we contribute to the cryptographic literature by designing new threshold implementations that are easily applicable to a large variety of symmetric-key algorithms. Our countermeasures are provably secure against side-channel analysis attacks in the presence of glitches. In comparison with recent publications, we address new issues while achieving similar or better performance. This research has resulted in two patents within STMicroelectronics, thereby contributing to the industrial innovation process. In a second phase, we study the symmetric-key algorithm SM4 and its resistance against side-channel analysis attacks. This work centralizes the SM4 countermeasures proposed in the literature and offers visibility into the software performance of these constructions. Finally, we introduce the first threshold implementation of the SM4 algorithm. Our construction is provably resistant against side-channel analysis attacks in the presence of glitches.
http://www.theses.fr/2021SORUS287/document
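Threshold implementations build on Boolean (XOR) secret sharing: a sensitive value never exists in one place, only as random shares. The sketch below shows just that splitting step; the full countermeasure additionally requires the shared round functions to satisfy non-completeness and uniformity, which a few lines cannot capture:

```python
import secrets

def share(x: int, n: int = 3) -> list[int]:
    """Split byte x into n random shares whose XOR equals x."""
    shares = [secrets.randbelow(256) for _ in range(n - 1)]
    last = x
    for s in shares:
        last ^= s                   # make the shares XOR back to x
    return shares + [last]

def unshare(shares: list[int]) -> int:
    """Recombine shares by XOR."""
    out = 0
    for s in shares:
        out ^= s
    return out

# XOR is linear, so a linear operation (here: adding a round constant)
# can be computed share-wise, without ever recombining the secret.
masked = share(0xAB)
masked[0] ^= 0x52                   # apply the constant to one share only
assert unshare(masked) == (0xAB ^ 0x52)
```

Nonlinear layers (the SM4 S-box, for instance) are the hard part: they must be re-expressed over the shares so that no intermediate result depends on all shares at once, which is where the thesis's constructions come in.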
https://isidore.science/document/10670/1.yfncgx
2021-12-10
Study and design of new encryption primitives based on rank metric error correcting codes
In 2005, Faure and Loidreau proposed a new rank-metric cryptosystem inspired by the Hamming-metric scheme of Augot and Finiasz (2003). In 2018, it was broken by the attack of Gaborit, Otmani, and Kalachi. Recently, there have been several attempts to repair the Faure-Loidreau scheme, for example the work of Renner, Puchinger, and Wachter-Zeh, called LIGA. In this thesis, we introduce a new cryptosystem called RAMESSES, which is another repair of the Faure-Loidreau scheme. We also study the recent attack of Coggia and Couvreur on Loidreau's cryptosystem (2017). Although they only propose an idea for a special case of the dimension of the secret subspace, this attack can be generalized. In this thesis, we propose an analysis of the Coggia-Couvreur attack on Loidreau's rank-metric public-key encryption scheme in the general case. The last part is a study of the decoding of sums of Gabidulin codes, inspired by Loidreau's 2005 work "Welch-Berlekamp Like Algorithm for Decoding Gabidulin Codes". This work is also an attempt to repair Loidreau's cryptosystem (2017) so as to avoid the Coggia-Couvreur attack.
http://www.theses.fr/2021REN1S072/document
https://isidore.science/document/10670/1.1mhfoo
2021-12-10
Design, analysis and implementation of cryptographic symmetric encryption algorithms on FPGA
This work studies several aspects of the design and implementation of symmetric cryptography. The focus is on two kinds of construction: lightweight block ciphers and sponge functions providing authenticated encryption. For both, the goal is to define solutions ensuring security bounds similar to those of standard algorithms while achieving good throughput and low area occupation. The first part of this thesis reviews the state of the art in block cipher design and examines which parameters and constructions may lead to the desired performance. We then define a new mode of operation achieving the same security margins as the modes standardized by NIST and ANSSI, while allowing applications where the initialization vector cannot be sent to both correspondents. The second half studies sponge functions, from the SHA-3 competition to the NIST LWC standardization process, covering both modes of operation and permutations, in order to achieve the best possible performance for different use cases.
http://www.theses.fr/2021UPASG104/document
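The sponge construction studied in the second half can be outlined in a few lines: absorb padded message blocks into the outer "rate" part of the state, permute, then squeeze output blocks. The permutation, rate, and padding below are made-up toys (not Keccak-f or any LWC candidate), kept only to show the mode's structure:

```python
RATE, CAPACITY = 8, 8          # bytes; state = rate part || capacity part

def toy_permutation(state: bytes) -> bytes:
    """Invertible stand-in for a real permutation (no security claim)."""
    rotated = bytearray(state[1:] + state[:1])   # rotate bytes
    for i in range(len(rotated)):
        rotated[i] = (rotated[i] * 5 + i) % 256  # *5 is invertible mod 256
    return bytes(rotated)

def sponge_hash(msg: bytes, out_len: int = 16) -> bytes:
    state = bytes(RATE + CAPACITY)
    # simplified padding: 0x80 then zeros up to a multiple of RATE
    padded = msg + b"\x80" + b"\x00" * (-(len(msg) + 1) % RATE)
    for i in range(0, len(padded), RATE):        # absorbing phase
        block = padded[i:i + RATE] + bytes(CAPACITY)
        state = bytes(a ^ b for a, b in zip(block, state))
        state = toy_permutation(state)
    out = b""
    while len(out) < out_len:                    # squeezing phase
        out += state[:RATE]
        state = toy_permutation(state)
    return out[:out_len]
```

The capacity bytes are never directly touched by input or output; the rate/capacity split is exactly the throughput/security trade-off the thesis tunes for FPGA targets.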
https://isidore.science/document/10670/1.asuh6v
2021-12-08
Cyber-Security Investment in the Context of Disruptive Technologies: Extension of the Gordon-Loeb Model
http://arxiv.org/abs/2112.04310
https://isidore.science/document/10670/1.31wrck
2021-12-08
On decoding algorithms for algebraic geometry codes beyond half the minimum distance
This thesis deals with algebraic geometry (AG) codes and their decoding. These codes consist of vectors constructed by evaluating specific functions at points of an algebraic curve. The underlying algebraic structure of these codes has made it possible to design several decoding algorithms. A first one, for codes from plane curves, was proposed in 1989 by Justesen, Larsen, Jensen, Havemose, and Hoholdt. It was then extended to arbitrary curves by Skorobogatov and Vladut and is called the "basic algorithm" in the literature. A few years later, Pellikaan and, independently, Koetter gave a formulation without algebraic geometry, using simply the language of codes. This new interpretation, named the "error-correcting pairs" (ECP) algorithm, represents a breakthrough in coding theory, since it applies to every code having a certain structure described only in terms of component-wise products of codes. The decoding radius of this algorithm depends on the code to which it is applied. For Reed-Solomon codes, it reaches half the minimum distance, which is the threshold for the solution to be unique. For AG codes, the algorithm almost always manages to decode a quantity of errors equal to half the designed distance. However, its success is only guaranteed for a quantity of errors less than half the designed distance minus some multiple of the curve's genus. Several attempts were then made to erase this genus-proportional penalty. A first decisive result was that of Pellikaan, who proved the existence of an algorithm with a decoding radius equal to half the designed distance. Then, in 1993, Ehrhard obtained an effective procedure for constructing such an algorithm. In addition to algorithms for unique decoding, AG codes admit algorithms that correct amounts of errors greater than half the designed distance. Beyond this quantity, the uniqueness of the solution may not be guaranteed. One then uses a so-called "list decoding" algorithm, which returns the list of all possible solutions; this is the case for Sudan's algorithm for Reed-Solomon codes. Another approach consists in designing algorithms that return a single solution but may fail; this is the case for "power decoding". Sudan's algorithm and power decoding were first designed for Reed-Solomon codes, then extended to AG codes. We observe that these extensions do not have the same decoding radii: that of Sudan's algorithm is lower than that of power decoding, the difference being proportional to the genus of the curve. In this thesis we present two main results. First, we propose a new algorithm, which we call "power error-locating pairs", that, like the ECP algorithm, can be applied to any code with a certain structure described in terms of component-wise products. Compared to the ECP algorithm, it can correct errors beyond half the designed distance of the code. Applied to Reed-Solomon or AG codes, it is equivalent to the power decoding algorithm, but it can also be applied to specific cyclic codes, for which it can decode beyond half the Roos bound. Moreover, applied to AG codes, the algorithm disregards the underlying geometric structure, which opens up interesting applications in cryptanalysis. The second result aims to erase the genus-proportional penalty in the decoding radius of Sudan's algorithm for AG codes. First, following Pellikaan's method, we prove that such an algorithm exists. Then, by combining and generalizing the works of Ehrhard and Sudan, we give an effective procedure to build this algorithm.
http://www.theses.fr/2021IPPAX101/document
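The "component-wise products of codes" on which the ECP and power error-locating pairs frameworks rest refer to the star (Schur) product: the code spanned by coordinate-wise products of codewords. Since the product is bilinear, it suffices to multiply generator rows pairwise, as this toy binary example (codes chosen purely for illustration) shows:

```python
import numpy as np
from itertools import product

def star_product_generators(G1: np.ndarray, G2: np.ndarray) -> np.ndarray:
    """Spanning set (over GF(2)) of the star product C1 * C2 of two
    binary codes given by generator matrices G1 and G2: all pairwise
    coordinate-wise products of generator rows, with duplicates removed."""
    rows = np.array([r1 * r2 for r1, r2 in product(G1, G2)]) % 2
    return np.unique(rows, axis=0)

G_rep = np.array([[1, 1, 1, 1]])                 # length-4 repetition code
G_toy = np.array([[1, 1, 0, 0], [0, 0, 1, 1]])   # a toy 2-dimensional code
print(star_product_generators(G_rep, G_toy))     # [[0 0 1 1] [1 1 0 0]]
```

Multiplying by the repetition code leaves the other code unchanged, illustrating the general fact that the all-ones code is neutral for the star product; the decoding algorithms above exploit how this product grows (or fails to grow) for structured codes.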
https://isidore.science/document/10670/1.pxnzod
2021-12-03
On certain types of code-based signatures
Digital signatures were first introduced in the work of Diffie and Hellman in 1976. They are a scientific art replacing the traditional written signature. Each signer has a "personal knowledge", or signing key, used to produce signatures. As with handwritten signatures, anyone seeing a signature should be convinced that it belongs to a certain person (and no one else). To produce such a signature, the signing key is indispensable, and the secrecy of this entity is usually protected by the hardness assumption of some computational problem. In the earliest stage, these were number-theoretic problems such as factoring large integers or computing discrete logarithms modulo some prime. However, with the rapid development of technology, these problems will be solved efficiently once the era of large-scale quantum computers arrives. Then comes the next stage in the development of digital signatures, in which most attention is given to the decoding problem (and its many variants), whose hardness resists even quantum computers. This problem underlies two important branches of cryptography, namely lattice-based cryptography and code-based cryptography, according to the main object involved. This thesis mainly concerns signatures in the latter branch, i.e., code-based cryptography. It proposes two main contributions. The first is a signature scheme in the Hamming-metric context. The scheme is obtained as an application of a chameleon hash function constructed entirely from classical code-based hardness assumptions. Its most notable feature is that it is proved secure in the standard model; since the security of code-based schemes in the random oracle model is still unclear, such a property is highly desirable. The second contribution is a group signature scheme in the rank-metric context. In general, the construction follows the framework devised for the Hamming metric. At its core, this framework uses two permutations derived from a random vector. Though quite efficient in the binary case, that is, when the base field is F2, this method shows its disadvantages when the base field changes. A natural question arises: how can we construct schemes over other fields? We answer this question by proposing a different permutation method. Our method has the advantage that it can be applied regardless of the metric under consideration.
http://www.theses.fr/2021LIMO0088/document
https://isidore.science/document/10670/1.kly51e
2021-11-30
Islam and the Trajectory of Globalization: Rational Idealism and the Structure of World History (Edition 1)
The book examines the growing tension between social movements that embrace egalitarian and inclusivist views of national and global politics, most notably classical liberalism, and those that advance social hierarchy and national exclusivism, such as neoliberalism, neoconservatism, and national populism. In exploring issues relating to tensions and conflicts around globalization, the book identifies historical patterns of convergence and divergence rooted in the monotheistic traditions, beginning with the ancient Israelites who dominated the Near East during the Axial Age, through Islamic civilization, and finally by considering the idealism-realism tensions of modern times. One thing remained constant throughout the various historical stages that preceded our current moment of global convergence: a recurring tension between transcendental idealism and various forms of realism. Transcendental idealism, which prioritizes egalitarian and universal values, pushed periodically against the forces of realism that privilege established law and power structures. Equipped with the idealism-realism framework, the book examines the consequences of the European realism that justified imperialistic ventures into Africa, the Middle East, and Latin America in the name of liberation and liberalization. This ill-conceived strategy has, ironically, engendered the very dysfunctional societies that produce the waves of immigrants in constant motion from the South to the North, even as it fostered the social hierarchies that transfer external tensions into identity politics within the countries of the North. The book focuses particularly on the role played historically by Islamic rationalism in translating the monotheistic egalitarian outlook into the institutions of religious pluralism, legislative and legal autonomy, and scientific enterprise at the foundation of modern society. It concludes by shedding light on the significance of the Muslim presence in Western cultures as humanity draws slowly but consistently towards what we may come to recognize as the Global Age. The Open Access version of this book, available at http://www.taylorfrancis.com/books/e/9781003203360, has been made available under a Creative Commons Attribution-Non Commercial-No Derivatives 4.0 license.
https://openresearchlibrary.org/viewer/d96d063d-1749-44b4-aab3-1a1dc946ec99
https://isidore.science/document/10670/1.jc6sgx
2021-11-30
Encoding the Sacred: New Lights on Ramesside Enigmatic Writing
Enigmatic writing relies on highly creative processes at the intersection of graphemics and visual semiotics (Werning 2020). When 'alienating' hieroglyphic spellings, Egyptian scribes simultaneously considered possible sound values, the figurative dimension of the signs, and associated mythological motifs. This process was most probably supported by brouillons (drafts) written in cursive scripts on portable media (Haring 2015), and such texts might have been 'deciphered' using the same type of devices. The lack of such documents has led recent studies to investigate the matter from a strictly theoretical point of view (Klotz & Stauder 2020; Roberson & Klotz 2020). Our paper aims at shedding new light on this transcoding process based on empirical evidence. A fresh analysis of O. Turin CGT 57440 (López 1982: 146-146a) reveals that the hieratic funerary composition on one side is the plain-text version of the cryptographic hieroglyphic text on the other. It is the first known case of an enigmatic running text transcoded into cursive script on a single document. This ostracon not only gives us the opportunity to compare an enigmatic text with its hieratic equivalent: it also provides a unique glimpse into New Kingdom scribal approaches to "visual poetry" (Morenz 2008). Furthermore, the text probably refers to the actual context in which it was inscribed, namely the tomb of a scribe called Amennakhte. Finally, combining a philological commentary with an examination of the layout and writing practices, we try to answer the following crucial question: should we consider the ostracon a brouillon for a text to be monumentalized, or a decoding exercise? In our view, the latter option is much more likely.
https://orbi.uliege.be/handle/2268/265209
https://isidore.science/document/2268/265209
2021-11-19
Theoretical hardness of algebraically structured learning with errors
The main focus of this Ph.D. thesis is the computational problem Learning With Errors (LWE). It is a core building block of lattice-based cryptography, which is itself among the most promising candidates to replace current cryptographic protocols once large-scale quantum computers become available. The contributions of the present work fall into two parts. First, we study the hardness of structured variants of LWE. To this end, we show that under suitable parameter choices the Module Learning With Errors (M-LWE) problem does not become significantly easier to solve even if the underlying secret is replaced by a binary vector. Furthermore, we provide a classical hardness reduction for M-LWE, which further strengthens our confidence in its suitability for cryptography. Additionally, we define a new hardness assumption, the Middle-Product Computational Learning With Rounding (MP-CLWR) problem, which inherits the advantages of two existing LWE variants. Finally, we study problems related to the partial Vandermonde matrix; this is a recent source of hardness assumptions for lattice-based cryptography, and its rigorous study is important to gain trust in it. In the second part of this manuscript, we show that the new hardness assumptions introduced before serve for the construction of efficient public-key encryption. On the one hand, we design a new encryption scheme whose security is provably based on the MP-CLWR problem. On the other hand, we modify an existing encryption scheme, PASS Encrypt, to provide it with a security proof based on two explicitly stated partial Vandermonde problems.
http://www.theses.fr/2021REN1S064/document
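For concreteness, an LWE instance of the kind whose hardness the thesis studies looks as follows. Parameters are toy-sized and deliberately insecure, and the binary secret mirrors the binary-secret M-LWE variant from the abstract (shown here in plain, unstructured form):

```python
import numpy as np

rng = np.random.default_rng(42)
n, m, q = 16, 32, 3329          # secret dimension, number of samples, modulus

s = rng.integers(0, 2, size=n)          # binary secret vector
A = rng.integers(0, q, size=(m, n))     # uniformly random public matrix
e = rng.integers(-2, 3, size=m)         # small centred error in {-2,...,2}
b = (A @ s + e) % q                     # the LWE samples: noisy inner products

# Without e, recovering s from (A, b) is plain linear algebra; the small
# noise is what makes the problem (conjecturally) hard, even quantumly.
residual = (b - A @ s) % q              # equals e modulo q
```

Structured variants like M-LWE replace the uniform matrix A with blocks carrying ring structure, trading some of this genericity for efficiency, which is exactly why their hardness needs the separate reductions the thesis provides.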
https://isidore.science/document/10670/1.fm4474
2021-11-16
Designing a Framework for Digital KYC Processes Built on Blockchain-Based Self-Sovereign Identity
Know your customer (KYC) processes place a great burden on banks, because they are costly, inefficient, and inconvenient for customers. While blockchain technology is often mentioned as a potential solution, it is not clear how to use the technology's advantages without violating data protection regulations and customer privacy. We demonstrate how blockchain-based self-sovereign identity (SSI) can solve the challenges of KYC. We follow a rigorous design science research approach to create a framework that utilizes SSI in the KYC process, deriving nascent design principles that theorize on blockchain's role for SSI.
http://arxiv.org/abs/2112.01237
https://isidore.science/document/10670/1.yjn1ql
2021-11-11
Revisiting the Properties of Money
http://arxiv.org/abs/2111.04483
https://isidore.science/document/10670/1.6ubmqk
2021-11-08
Nabokov and the Mathematics of Language
Vladimir Nabokov's theory of translation has long validated the scholarly image of Nabokov as a modernist aesthete. That is, his belief in the "untranslatability" of literary works foregrounds a "mystical submission to aesthetic authority" (McGurl 2011: 5). This article reframes our reading of Nabokov's prioritization of aesthetics in light of his technical, scientific, and, as I will show, even mathematical conceptualization of linguistic properties. As such, this article contributes to recent...
http://books.openedition.org/apu/23133
https://isidore.science/document/10670/1.m8xicj
2021-11-04
Security and its challenges in the 21st century
By the year 2000, a balance was sought between security requirements and respect for privacy, as well as for individual and collective freedoms. As we progress further into the 21st century, however, security is taking precedence within an increasingly controlled society. This shift is due to advances in innovative technologies and the investments made by commercial companies to drive constant technological progress. Despite the implementation of the General Data Protection Regulation (GDPR) in the EU in 2018 and of 2020's California Consumer Privacy Act (CCPA), regulatory bodies do not have the ability to fully manage the consequences presented by emerging technologies. Security and Its Challenges in the 21st Century provides students and researchers with an international legal and geopolitical analysis; it is also intended for those interested in societal development, artificial intelligence, smart cities, and quantum cryptology.
https://hal.archives-ouvertes.fr/hal-03340151
https://isidore.science/document/10670/1.fynmp5
2021-11
Scaling Blockchains: Can Elected Committees Help?
In the high-stakes race to develop more scalable blockchains, some platforms (Cosmos, EOS, TRON, etc.) have adopted committee-based consensus protocols, whereby the blockchain's record-keeping rights are entrusted to a committee of elected block producers. In theory, the smaller the committee, the faster the blockchain can reach consensus and the more it can scale. What is less clear is whether this mechanism ensures that honest committees can be consistently elected, given that voters typically have limited information. Using EOS's Delegated Proof of Stake (DPoS) protocol as a backdrop, we show that identifying the optimal voting strategy is complex and practically out of reach. We empirically characterize some simpler (suboptimal) voting strategies that token holders resort to in practice and show that these nonetheless converge to optimality exponentially quickly. This yields efficiency gains over other PoS protocols that rely on randomized block producer selection. Our results suggest that (elected) committee-based consensus, as implemented in DPoS, can be robust and efficient, despite its complexity.
http://arxiv.org/abs/2110.08673
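The election step the paper analyzes can be sketched as stake-weighted approval voting: each token holder approves a set of candidates, and the top candidates by approved stake form the committee (EOS elects 21 producers and lets each staker approve up to 30; the names and stakes below are made up for illustration):

```python
from collections import Counter

def elect(votes, committee_size):
    """votes: iterable of (stake, set of approved candidates) pairs.
    Returns the committee: top candidates by stake-weighted approvals,
    ties broken alphabetically for determinism."""
    tally = Counter()
    for stake, approved in votes:
        for candidate in approved:
            tally[candidate] += stake       # approval adds full stake weight
    ranked = sorted(tally, key=lambda c: (-tally[c], c))
    return ranked[:committee_size]

votes = [(50, {"alice", "bob"}),
         (30, {"bob", "carol"}),
         (20, {"carol", "dave"})]
print(elect(votes, 2))   # bob: 80, alice: 50, carol: 50, dave: 20 -> ['bob', 'alice']
```

The paper's question is whether such voting, under limited information, reliably keeps dishonest candidates out of the top slots; this sketch shows only the tallying mechanics, not the voters' strategy problem.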
https://isidore.science/document/10670/1.g6a2ra
2021-10-16
Techniques of cryptanalysis for symmetric-key primitives
This thesis contributes to the cryptanalysis effort needed to trust symmetric-key primitives such as block ciphers and pseudorandom generators. In particular, it studies a family of distinguishers based on subspace trails against SPN ciphers. The thesis also provides methods for modeling frequent cryptanalysis problems as MILP (mixed-integer linear programming) problems, allowing cryptographers to benefit from the existence of very efficient MILP solvers. Finally, it presents techniques for analyzing algebraic properties of symmetric-key primitives, which could be useful for mounting cube attacks.
http://www.theses.fr/2021SORUS217/document
https://isidore.science/document/10670/1.gpg47k
2021-10-08
Market Microstructure of Non Fungible Tokens
http://arxiv.org/abs/2112.03172
https://isidore.science/document/10670/1.u56755
2021-10-08