9 Best Bitcoin Miner Software (Oct 2020)

Recap on CoinEx & Avalanche AMA Aug 5, 2020

Written by SatoshisAngels
Published by read.cash
On August 5th, 2020, Satoshi’s Angels hosted an AMA for CoinEx on “How BCH and Avalanche Are Bringing Financial Freedom to 6 Billion People” on the Chinese platform Bihu. During the 100-minute event, Haipo Yang of ViaBTC and CoinEx and Emin Gun Sirer of AVA Labs shared their in-depth views on topics such as consensus mechanisms, community governance, IPFS, and DeFi, and Haipo explained why he wants to fork BCH. This is the full text.
You can check out the full AMA here (mostly in Chinese with some English translation).

https://preview.redd.it/x790bw58axf51.png?width=1920&format=png&auto=webp&s=03c8af942f8f14d98d5dd693adf9e2a50448d61d
Cindy Wang (Satoshi’s Angels): There is news that you plan to fork BCH. Is it a marketing stunt? Are you serious about it?
Haipo Yang: It’s definitely not a marketing stunt. But the details are not decided yet.
Over the past three years, the BCH community has gone through multiple disputes, from reducing block time and changing mining algorithms to adding smart contracts. None of these disputes has been well settled.
BCH is a big failure in terms of governance. A lack of good governance has let it fall into disorder. It is too decentralized to make progress.
You may know that the first BCH block was mined by ViaBTC, and we did give it a lot of support. But we didn’t dominate the fork. The Chinese community in particular thought I had a lot of influence, but that was not true.
I think the whole community is very dissatisfied with Bitcoin ABC, but it is difficult to replace them or change the status quo. So I am thinking of creating a new branch of BCH. The idea is still at an early stage. I welcome anyone interested to participate and discuss it with me.
Wang: Professor Emin, what’s your attitude toward the fork? Do you think it’s good timing to fork BCH?
Emin Gun Sirer: I am a big fan of BCH. It adheres to the original vision of Satoshi Nakamoto, and I like its technical roadmap. But just as Haipo mentioned, BCH lacks a good governance mechanism. There is always something that can cause the BCH community to divide itself.
But a good governance mechanism alone is not enough. There have been many good proposals in the community that failed to be adopted in the end. I think BCH needs social leadership to encourage discussion when there are new proposals.
Wang: We are all curious to know: how did Avalanche get its name?
Sirer: I know that Avalanche doesn’t sound good in Chinese, but in English it’s a very powerful word. Avalanche represents a series of algorithms piling together like a mountain. As decisions slowly form, the ball (the nodes in the network) at the top of the mountain starts rolling down one side, getting bigger and bigger until, like an avalanche, it becomes unstoppable, making the transaction final.
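For readers who want a feel for the mechanism being described, here is a toy sketch of the repeated-subsampling idea behind the Avalanche family of protocols. All names and parameters are illustrative, not AVA Labs’ implementation: each node repeatedly polls a small random sample of peers, flips its preference toward any strong majority it sees, and finalizes after enough consecutive confirmations.

```python
import random

def avalanche_toy(n_nodes=100, k=10, alpha=0.7, beta=10, max_rounds=10_000, seed=1):
    """Toy repeated-subsampling consensus on a binary choice (0 or 1)."""
    rng = random.Random(seed)
    prefs = [rng.choice([0, 1]) for _ in range(n_nodes)]  # initial split
    confidence = [0] * n_nodes
    finalized = [False] * n_nodes
    rounds = 0
    while not all(finalized) and rounds < max_rounds:
        rounds += 1
        for i in range(n_nodes):
            if finalized[i]:
                continue
            # Poll k random peers; count votes for value 1.
            votes = sum(prefs[j] for j in rng.sample(range(n_nodes), k))
            if votes >= alpha * k:
                winner = 1
            elif (k - votes) >= alpha * k:
                winner = 0
            else:
                confidence[i] = 0  # no strong majority this round: reset
                continue
            if winner == prefs[i]:
                confidence[i] += 1  # majority confirms current preference
                finalized[i] = confidence[i] >= beta
            else:
                prefs[i], confidence[i] = winner, 1  # snowball toward majority
    return prefs.count(1), rounds

ones, rounds = avalanche_toy()
print(f"{ones}/100 nodes prefer value 1 after {rounds} rounds")
```

Once one value gains even a small edge, each round of sampling amplifies it, which is the “unstoppable snowball” in the metaphor above.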
Wang: Prof. Emin, I know that you are a big blocker. Have you ever considered implementing Avalanche based on BCH? Why create another chain?
Sirer: Of course I considered that. Satoshi Nakamoto’s consensus is wonderful, but proof-of-work and Nakamoto-consensus-based protocols have shortcomings, such as network latency, and they are hard to scale. Avalanche is totally different, and it is the biggest breakthrough in consensus of the past 45 years. It is flexible, fast, and scalable. I’d love to implement BCH on top of Avalanche in the future, to make BCH even better by making 0-conf transactions much more secure.
Wang: As an old miner, why did CoinEx Chain choose to “abandon” POW, and turn to POS mechanism?
Haipo: Both POW and POS consensus algorithms have their own advantages. POW is not just a consensus algorithm, but also a more transparent and open distribution method of digital currency. Anyone can participate in it through mining.
POW is fairer. In a POS-based network, participants must hold coins; for example, you need to invest in ICO projects to obtain coins, while developers can get a lot of coins almost for free. In addition, POW is more open: anyone can participate without holding tokens. As long as you have a computer and mining rigs, you can participate in mining. Openness and fairness are two great features of POW. POS, for its part, is more advanced, safe, and efficient.
A POS network is jointly maintained by the token holders, and there is no problem of 51% attacks. Those who hold tokens are more inclined to protect the network than to destroy it for their own interests. To disrupt the network, you would need to buy at least two-thirds of the tokens, which is very difficult to achieve. And once you actually held that many coins, you would have little reason to destroy the network.
POW has the problem of 51% attacks. For example, ETC suffered a 51% attack on August 3, and the cost of doing that is very low: the chain can be reorganized for only tens of thousands of dollars. This is a defect of POW.
In addition, in terms of TPS and block speed, POS can achieve second-level confirmation and higher TPS. CoinEx Chain chose POS because it brings a faster transaction experience, which is very important for a decentralized exchange. Both POW and POS have their own advantages; it’s a matter of choice, and the choice must be made according to the characteristics of the specific project.
https://preview.redd.it/upbayijaaxf51.jpg?width=1055&format=pjpg&auto=webp&s=703e3b6a493a76f86bc9045e784d174bde9d3c42
Wang: Ethereum is switching to ETH 2.0. If they succeed, do you think it will lead the next bull market?
Sirer: If Ethereum 2.0 can be realized, it must be a huge success.
But I doubt it can be launched anytime soon, considering that it has been constantly delayed. And even if it comes out, I am not sure it will address the core scaling problem. The main technology in Ethereum 2.0 is sharding, which divides the Ethereum network into small parallel groups. But I think what will happen is that everyone wants to be in the same shard, so the advantages of sharding might not be realizable in Ethereum 2.0.
Avalanche supports Ethereum’s virtual machine, and Avalanche can achieve one-second confirmation, while finalizing a confirmation with sharding takes 5–6 seconds at best. Avalanche’s approach to scaling Ethereum is superior to Ethereum 2.0’s. There are many big players behind Ethereum 2.0, and I wish them success. But I believe that Avalanche will be the fastest and best smart-contract platform in the crypto space, and it is compatible with Ethereum.
Wang: Why is Avalanche a real breakthrough?
Sirer: Avalanche is fundamentally different from previous consensus mechanisms. It is very fast, with TPS surpassing 6,500, roughly three times that of VISA. Six confirmations can be achieved in one second. Compared with the POW mechanisms of Bitcoin and Bitcoin Cash, Avalanche’s participation threshold is very low. It also allows multiple virtual machines to be built on the Avalanche protocol.
Avalanche is not created to compete with Bitcoin or fiat currencies such as the US dollar and RMB. It’s not made to compete with Ethereum, which is defined as the “world’s computer”. Avalanche is positioned to be an asset issuance platform to tokenize assets in the real world.
Wang: How do you rank the importance of community, development, governance, and technology to a public chain?
Sirer: These four are like the legs of a table. Every leg is very important; the table cannot stand without strong support.
A good community needs to be open to welcome developers and people. Good governance is especially important, to figure out what users need and respect their voices. Development needs to be decentralized. Avalanche has developers all over the world. And it has big companies building on top of Avalanche.
Yang: From a long-term perspective, I think governance is the most important thing, which is the same as running a company.
In the long run, technology is not the most important factor. Blockchain technology is built on open-source software that is free to the community. Community is also not the most important factor.
I think the most important thing is governance. Decentralization is more of a technical property. Bitcoin, for example, through a decentralized network, ensures the openness and transparency of data and assets: the data on the chain cannot be tampered with, and the total supply of coins has a fixed upper limit.
But at the governance level, all coins are centralized to some degree. For example, BCH developers can decide to modify the protocol. In a sense, it is the same as managing a company.
Historically, the successes and failures of companies have stemmed from governance. For example, Apple succeeded on the basis of Steve Jobs’s charisma, leadership, and pursuit of user experience. When Jobs was kicked out, Apple suffered great losses; after Jobs returned, he made Apple great again.
The issues behind Bitmain are also about governance. Simply put, governance requires leaders who have a longer-term vision and are capable of coordinating and balancing the resources and interests of all parties to lead the community.
In the blockchain world, many people focus on technology. In fact, technology alone is not enough to make great products. User experience matters most. Users don’t care about blockchain technology itself; they care about whether a product is easy to use and whether it solves their problems.
We need to figure out how to deliver a product the way Apple does. The pursuit of user experience is governance in nature, and governance itself lies in the soul of the key leaders of the community.
Realizing the tokenization of assets in the real world
https://preview.redd.it/14jf1bvcaxf51.jpg?width=1082&format=pjpg&auto=webp&s=c312912142c38de986f42912086e205354162190
Wang: Speaking of asset tokenization, I would like to ask Haipo, do you think the market for assets on the chain is big?
Yang: It must be very big. We need to see which assets can be tokenized.
Assets that can be tokenized are standardized assets, such as currencies and securities.
  1. In terms of currency, Tether has issued over 10 billion U.S. dollars’ worth of USDT. Many people think that’s too much, but I think this market is underestimated. The market for stablecoins will eventually reach hundreds of billions or even trillions, especially after the release of Facebook’s Libra. Even the US dollar might be issued on the blockchain in the future.
At present, USD settlement goes through the SWIFT system. But SWIFT is only a clearing network, a messaging system, not a settlement network; clearing and settlement take a long time, and it is not reliable. USDT and USDC, by contrast, can complete cross-border transfers and asset delivery in seconds. Even sovereign currencies are likely to be issued on the blockchain; I believe the RMB has such a plan.
  2. Equity and securities markets are the largest market, but they have strict requirements for market access.
Whether a stock is listed on A-shares or in the American markets, listing is hard to obtain. I believe blockchain can completely release this demand through decentralization, allowing any tiny company, or even a single project, to issue, circulate, and finance a token.
There may be only tens of thousands of stocks traded globally today, and there are likewise tens of thousands of tokens in the crypto space. I believe millions or more assets will be traded and circulated in the future. This can only be realized through decentralized technology and organization.
The market for asset tokenization will be huge. At present, blockchain technology as a whole is still very primitive: Bitcoin and Ethereum handle only a few to a dozen TPS, far from meeting market demand. This is why CoinEx is committed to building a decentralized DEX public chain.
Wang: Avalanche’s paper was first published on IPFS. What do you think of IPFS?
Sirer: I personally like IPFS very much. It is a decentralized storage solution.
Yang: There is no doubt that IPFS solves the problem of decentralized storage, can be robust in the blockchain world, and could replace HTTP services. But there are still three problems:
  1. IPFS is not for ordinary users. Everybody needs BCH and BTC, but only developers need IPFS, which is a relatively niche market;
  2. IPFS is more expensive than traditional storage solutions, which further reduces its practicality. To achieve decentralization, more copies must be stored and more hardware consumed, and in the end these costs are passed on to users;
  3. There may be compliance issues. If IPFS is used to store sensitive information, such as material from WikiLeaks, it may end up threatening national security. I doubt that decentralized storage and decentralized public chains can survive the joint pressure of global governments.
The IPFS project solves certain problems. But from the perspective of application prospects, I am pessimistic.
Wang: What do you think of Defi?
Yang: I want to talk about the concept first.
Broadly speaking, the entire blockchain industry is DeFi in nature. Blockchain is to realize the circulation of currency, equity, and asset value through decentralization.
So in a broad sense, blockchain itself is DeFi. In a narrow sense, DeFi means financial agreements based on smart contracts. Through smart contracts, DeFi can build applications more flexibly. For example, before, we could only use Bitcoin to transfer and pay; now, with smart contracts, flexible functions such as lending, exchange, and mortgage are available. The entire blockchain industry is gradually evolving toward DeFi, and DeFi will definitely see greater development in the future.
Sirer: I think Defi will definitely have a huge impact. DeFi is not only an innovation in the cryptocurrency field, but also an innovation in the financial field. Wall Street companies have stagnated for years with no innovation. Avalanche fits different DeFi needs, including performance and compliance. In the future, not only will Wall Street simply adopt DeFi, but DeFi will grow into a huge market that will eventually replace the traditional financial system.
Questions from the community:
1. How does Avalanche integrate with DeFi?
Sirer: At present, DeFi applications on Avalanche surpass their Ethereum counterparts: what can be achieved on Ethereum can be achieved on Avalanche with a better user experience. We are currently connecting with popular DeFi projects such as Compound and MakerDao to bring over part or all of their functions.
Avalanche is also working on decentralized exchange (DEX). Current DEXs are limited by speed and performance, but built on top of Avalanche they will be real-time and very fast.
2. How many developers does BCH have?
Yang: I think it does not matter how many developers there are. What matters is what should be developed. I watched a video of Jobs the other day, and it inspired me a lot. We are not piecing together technology to see what it can do; we figure out what we want first, and then we use the technology we need.
The entire blockchain community worships developers; people even call Vitalik “V God.” There is no need to treat developers as wizards. Developers are programmers, and I myself am a programmer.
ViaBTC has a development team of over 100 people, including core members from Copernicus (a dev team that formerly belonged to Bitmain). Technically, we are very confident we can build faster, more stable products with a better user experience.
submitted by CoinExcom to btc

The Problem with PoW

Miners have always had it rough...
"Frustrated Miners"

The Problem with PoW
(and what is being done to solve it)

Proof of Work (PoW) is one of the most commonly used consensus mechanisms entrusted to secure and validate many of today’s most successful cryptocurrencies, Bitcoin being one. Battle-hardened and having weathered the test of time, Bitcoin has demonstrated the undeniable strength and reliability of the PoW consensus model through sheer market saturation, and of course, its persistency.
In addition to the cost of powerful computing hardware, miners prove that they are benefiting the network by expending energy in the form of electricity, by solving and hashing away complex math problems on their computers, utilizing any suitable tools that they have at their disposal. The mathematics involved in securing proof of work revolve around unique algorithms, each with their own benefits and vulnerabilities, and can require different software/hardware to mine depending on the coin.
Because each block has a unique and entirely random hash, or “puzzle” to solve, the “work” has to be performed for each block individually and the difficulty of the problem can be increased as the speed at which blocks are solved increases.
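To make the “puzzle” concrete, here is a minimal toy proof-of-work loop in Python. It illustrates the general scheme rather than any specific coin’s code: the miner searches for a nonce whose SHA-256 hash falls below a target, and each extra difficulty bit doubles the expected work.

```python
import hashlib
import time

def mine(block_data: bytes, difficulty_bits: int):
    """Search for a nonce so that sha256(block_data + nonce) has
    `difficulty_bits` leading zero bits; each extra bit doubles the work."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()
        nonce += 1

start = time.time()
nonce, digest = mine(b"example block header", difficulty_bits=18)
print(f"nonce={nonce} hash={digest[:16]}... found in {time.time() - start:.2f}s")
```

Raising `difficulty_bits` by one halves the number of valid hashes, which is exactly how a network responds when blocks start arriving too quickly.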

Hashrates and Hardware Types

While proof of work is an effective means of securing a blockchain, it inherently promotes competition amongst miners seeking higher and higher hashrates due to the rewards earned by the node who wins the right to add the next block. In turn, these higher hash rates benefit the blockchain, providing better security when it’s a result of a well distributed/decentralized network of miners.
When Bitcoin first launched its genesis block, it was mined exclusively by CPUs. Over the years, various programmers and developers have devised newer, faster, and more energy efficient ways to generate higher hashrates; some by perfecting the software end of things, and others, when the incentives are great enough, create expensive specialized hardware such as ASICs (application-specific integrated circuit). With the express purpose of extracting every last bit of hashing power, efficiency being paramount, ASICs are stripped down, bare minimum, hardware representations of a specific coin’s algorithm.
This gives ASICs a massive advantage in terms of raw hashing power and energy consumption over CPUs/GPUs, but with the significant drawback of being very expensive to design and manufacture, translating to a high economic barrier for the casual miner. Because they are hardware representations of a single targeted algorithm, if a project decides to fork and change algorithms suddenly, your powerful brand-new ASIC becomes a very expensive paperweight. The high costs of developing and manufacturing ASICs, and the associated risks, make them unfit for mass adoption at this time.
Somewhere on the high end, in the vast hashrate expanse created between GPU and ASIC, sits the FPGA (field programmable gate array). FPGAs are basically ASICs that make some compromises with efficiency in order to have more flexibility, namely they are reprogrammable and often used in the “field” to test an algorithm before implementing it in an ASIC. As a precursor to the ASIC, FPGAs are somewhat similar to GPUs in their flexibility, but require advanced programming skills and, like ASICs, are expensive and still fairly uncommon.

2 Guys 1 ASIC

One of the issues with proof of work incentivizing the pursuit of higher hashrates is in how the network calculates block reward coinbase payouts and rewards miners based on the work that they have submitted. If a coin generated, say a block a minute, and this is a constant, then what happens if more miners jump on a network and do more work? The network cannot pay out more than 1 block reward per 1 minute, and so a difficulty mechanism is used to maintain balance. The difficulty will scale up and down in response to the overall nethash, so if many miners join the network, or extremely high hashing devices such as ASICs or FPGAs jump on, the network will respond accordingly, using the difficulty mechanism to make the problems harder, effectively giving an edge to hardware that can solve them faster, balancing the network. This not only maintains the block a minute reward but it has the added side-effect of energy requirements that scale up with network adoption.
Imagine, for example, if one miner gets on a network all alone with a CPU doing 50 MH/s and is getting all 100 coins that can possibly be paid out in a day. Then, if another miner jumps on the network with the same CPU, each miner would receive 50 coins in a day instead of 100, since they are splitting the required work evenly, despite the fact that the net electrical output has doubled along with the work. Electricity costs miners money and is a factor in driving up coin price along with adoption, and since more people are now mining, the coin is less centralized. Now let’s say a large corporation has found it profitable to manufacture an ASIC for this coin, knowing they will make their money back mining it or selling the units to professionals. They join the network doing 900 MH/s and will be pulling in 90 coins a day, while the two guys with their CPUs each get 5 now. Those two guys aren’t very happy, but the corporation is. Not only does this negatively affect the miners, it compromises the security of the entire network by centralizing the coin supply and hashrate, opening the doors to double spends and 51% attacks from potential malicious actors. Uncertainty of motives and questionable validity in a distributed ledger do not mix.
When technology advances in a field, it is usually applauded and welcomed with open arms, but in the world of crypto things can work quite differently. One of the glaring flaws in the current model and the advent of specialized hardware is that it is never-ending. Suppose the two men from the rather extreme example above take out loans to buy that ASIC they heard about, the one that can earn them 90 coins a day. When they join the other ASIC on the network, the difficulty adjusts to keep daily payouts consistent at 100, and they will each receive only 33 coins instead of 90, since the reward is now being split three ways. Now what happens when a better ASIC is released by that corporation? Hopefully, those two guys were able to pay off their loans and sell their old ASICs before they became obsolete.
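The payout arithmetic in these examples is just proportional splitting: with a fixed daily emission, each miner’s expected share equals their fraction of the total network hashrate. A quick sketch, using the illustrative numbers from above:

```python
def expected_daily_coins(hashrates_mhs, daily_emission=100):
    """Each miner's expected reward is their share of the total
    network hashrate times the fixed daily emission."""
    nethash = sum(hashrates_mhs.values())
    return {name: daily_emission * h / nethash for name, h in hashrates_mhs.items()}

# Two CPUs at 50 MH/s each:
print(expected_daily_coins({"cpu1": 50, "cpu2": 50}))
# -> {'cpu1': 50.0, 'cpu2': 50.0}

# A 900 MH/s ASIC joins the same network:
print(expected_daily_coins({"cpu1": 50, "cpu2": 50, "asic": 900}))
# -> {'cpu1': 5.0, 'cpu2': 5.0, 'asic': 90.0}
```

The emission never changes; only the denominator grows, which is why every hardware arms race squeezes the smallest participants first.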
This system, as it stands now, only perpetuates a never ending hashrate arms race in which the weapons of choice are usually a combination of efficiency, economics, profitability and in some cases control.

Implications of Centralization

This brings us to another big concern with expensive specialized hardware: the risk of centralization. Because they are so expensive and inaccessible to the casual miner, ASICs and FPGAs predominantly remain limited to a select few. Centralization occurs when one small group or a single entity controls the vast majority of hash power and, as a result, coin supply, and is able to exert its influence to manipulate the market or, in some cases, the network itself (usually the case with dishonest nodes or bad actors).
This is entirely antithetical to what cryptocurrency was born of, and since its inception many concerted efforts have been made to avoid centralization at all costs. An entity in control of a centralized coin would have the power to manipulate the price, and controlling a centralized hashrate would let them affect network usability and reliability, and even perform double spends leading to the demise of a coin, among other things.
The world of crypto is a strange new place, with rapidly growing advancements across many fields, economies, and borders, leaving plenty of room for improvement; while it may feel like a never-ending game of catch-up, there are many talented developers and programmers working around the clock to bring us all more sustainable solutions.

The Rise of FPGAs

With the recent implementation of the commonly used coding language C++, and due to their overall flexibility, FPGAs are becoming somewhat more common, especially on larger farms and in industrial settings, but they still remain largely out of the hands of most mining enthusiasts and almost unheard of to the average hobby miner. Things appear to be changing, though; one example is discussed below, and some believe we will soon see a day when mining with a CPU or GPU just won’t cut it any longer, and the market will be dominated by FPGAs and specialized ASICs, bringing efficiency gains for proof of work while also carelessly leading us all toward the next round of spending.
A perfect real-world example of the effect specialized hardware has had on the crypto community was recently discovered involving a fairly new project called VerusCoin and a fairly new, relatively more economically accessible FPGA. The FPGA is designed to target specific alt-coins whose algos do not require RAM overhead. It was discovered that the company had released a new algorithm, kept secret from the public, which could effectively mine Verus at 20x the speed of GPUs, the next-fastest hardware type mining on the Verus network.
Unfortunately this was done with a deliberately secret approach, calling the Verus algorithm “Algo1” and encouraging owners of the FPGA never to speak of the algorithm in public channels, admonishing a user when one did let the cat out of the bag. The problem with this business model is that it is parasitic in nature. In an ecosystem where advancements can benefit the entire crypto community, this sort of secret mining approach does not support the philosophies set forth by Bitcoin or the subsequent open-source and decentralization movements.
Although this was not done in the spirit of open source, it does hint to an important step in hardware innovation where we could see more efficient specialized systems within reach of the casual miner. The FPGA requires unique sets of data called a bitstream in order to be able to recognize each individual coin’s algorithm and mine them. Because it’s reprogrammable, with the support of a strong development team creating such bitstreams, the miner doesn’t end up with a brick if an algorithm changes.

All is not lost thanks to.. um.. Technology?

Shortly after discovering FPGAs on the network, the Verus developers quickly designed, tested, and implemented a new, much more complex and improved algorithm via a fork that enabled Verus to transition smoothly from VerusHash 1.0 to VerusHash 2.0 at block 310,000. Since the fork, VerusHash 2.0 has done exactly what it was designed for: equalizing hardware performance relative to the device being used, enabling CPUs (the most widely available “ASICs”) to mine side by side with GPUs at a profit, and it appears this will also apply to other specialized hardware. This is something no other project has been able to do until now. Rather than pursue the folly of so many projects before it, attempting to be “ASIC proof,” Verus effectively achieved and presents to the world an entirely new model of “hardware homogeny.” As the late, great Bruce Lee once said: “Don’t get set into one form, adapt it and build your own, and let it grow, be like water.”
In the design of VerusHash 2.0, Verus has shown it doesn’t resist progress like so many other new algorithms try to do; it embraces change and adapts to it in the way that water becomes whatever vessel it inhabits. This new approach, an industry first, could very well become an industry standard, and in doing so would usher in a new age for proof-of-work coins. VerusHash 2.0 has the potential to correct the single largest design flaw in the proof-of-work consensus mechanism: the ever-expanding monetary and energy requirements that have plagued PoW-based projects since the mechanism’s inception. Verus also addresses the major issue of coin and net-hash centralization by enabling legitimate CPU mining, offering greater coin and hashrate distribution.
Digging a bit deeper, it turns out the Verus development team are no rookies. The lead developer, Michael F. Toutonghi, has spent decades programming in the field and is a former Vice President and Technical Fellow at Microsoft, recognized founder and architect of Microsoft’s .Net platform, ex-Technical Fellow of Microsoft’s advertising platform, ex-CTO of Parallels Corporation, and an experienced distributed computing and machine learning architect. The project he helped create employs a diverse array of technologies and security features to form one of the most advanced and secure cryptocurrencies to date. A brief description of what makes VerusCoin special, quoted from a community member:
"Verus has a unique and new consensus algorithm called Proof of Power which is a 50% PoW/50% PoS algorithm that solves theoretical weaknesses in other PoS systems (Nothing at Stake problem for example) and is provably immune to 51% hash attacks. With this, Verus uses the new hash algorithm, VerusHash 2.0. VerusHash 2.0 is designed to better equalize mining across all hardware platforms, while favoring the latest CPUs over older types, which is also one defense against the centralizing potential of botnets. Unlike past efforts to equalize hardware hash-rates across different hardware types, VerusHash 2.0 explicitly enables CPUs to gain even more power relative to GPUs and FPGAs, enabling the most decentralizing hardware, CPUs (due to their virtually complete market penetration), to stay relevant as miners for the indefinite future. As for anonymity, Verus is not a "forced private", allowing for both transparent and shielded (private) transactions...and private messages as well"

If other projects can learn from this and adopt a similar approach, or continue to innovate with new ideas, it could mean an end to the doom-and-gloom predictions that CPU and GPU mining are dead. It would offer a much-needed reprieve and an alternative to miners who have faced the difficult decision of either pulling the plug and shutting down shop, or breaking down their rigs to sell off parts and buy new, more expensive hardware, and in so doing it would present an overall unprecedented level of decentralization not yet seen in cryptocurrency.
Technological advancements led us to the world of secure digital currencies, and the progress being made with hardware efficiencies is indisputably beneficial to us all. ASICs and FPGAs aren’t inherently bad, and there are ways in which they could be made more affordable and available for mass distribution. More than anything, it is important that we work together as communities to find solutions that can benefit us all for the long term.

In an ever-changing world where it may be easy to lose sight of the real accomplishments that brought us to this point, one thing is certain: cryptocurrency is here to stay, and the projects doing something to solve the current problems in the proof-of-work consensus mechanism will be the ones that lead us toward our collective vision of a better world, not just for the world of crypto but for each and every one of us.
submitted by Godballz to CryptoCurrency

AN INTRODUCTION TO DIGIBYTE

DigiByte

What are cryptocurrencies?
Cryptocurrencies are peer-to-peer technology protocols which rely on the block-chain: a system of decentralized record keeping which allows people to exchange unmodifiable and indestructible information, “coins,” globally in little to no time with little to no fees. This translates into the exchange of value, as these coins cannot be counterfeited nor stolen. This concept was started by Satoshi Nakamoto (allegedly a pseudonym for a single man or organization), who described and coded Bitcoin in 2009.
What is DigiByte?
DigiByte (DGB) is a cryptocurrency like Bitcoin. It is also a decentralized applications protocol in a similar fashion to Neo or Ethereum.
DigiByte was founded and created by Jared Tate in 2014. DigiByte allows for fast (virtually instant) and low-cost (virtually free) transactions. DigiByte is hard capped at 21 billion coins, which will be mined over a period of 21 years. DigiByte was never an ICO and was mined/created in the same way that Bitcoin or Litecoin initially were.
DigiByte is the fastest UTXO PoW scalable block-chain in the world. We’ll cover what this really means down below.
DigiByte has put forth and applied solutions to many of the problems that have plagued Bitcoin and cryptocurrencies in general – those being:
We will address these point by point in the subsequent sections.
The DigiByte Protocol
DigiByte maintains these properties through use of various technological innovations which we will briefly address below.
Why so many coins? 21 Billion
When initially conceived, Bitcoin was the first of its kind, and it came into the hands of a few. The beginnings of a coin such as Bitcoin were difficult; it had to go through a lot of initial growing pains which the coins that followed did not have to face. It is for this reason, among others, that I believe Bitcoin was capped at 21 million, and why today it has secured a place as digital gold.
When Bitcoin was first invented, no one knew anything about cryptocurrencies; for the inventor to get them out to the public, he would have to give them away. This is probably how the first Bitcoins were passed on: for free! But then as interest grew, so did the community. For it to build something which could go on to have actual value, the coin would have to go through a steady growth phase, so controlling inflation through mining was extremely important. That is also why the cap for Bitcoin was probably set so low: to allow these coins to amass value without being destroyed by inflation (from mining) the way fiat is today. In my mind, Satoshi Nakamoto knew what he was doing when setting it at 21 million BTC and must have known, and even anticipated, that others would take his design and build on top of it.
At DigiByte, we are that better design, capped at 21 billion. That’s 1,000 times larger than the supply of Bitcoin. Why, though? Why is the cap on DigiByte so much higher than that of Bitcoin? Because DigiByte was conceived to be used not as digital gold, nor as any sort of commodity, but as a real currency!
Today on planet Earth, we are approximately 7.6 billion people. If each person should want or need to use and live off Bitcoin, then, split equally, each person could own at best only 0.00276315789 BTC. The value of all the money on the whole planet is estimated to have recently passed 80 trillion dollars. That means each whole unit of Bitcoin would be worth approximately $3,809,523.81!
$3,809,523.81
This is of course an extreme case where everyone uses Bitcoin for everything. But even in a more conservative scenario, the fact remains that with such a low supply, each unit of Bitcoin would become absurdly expensive, if not inaccessible to most. Imagine trying to buy anything under a dollar!
Not only would using Bitcoin as an everyday currency be a logistical nightmare, it would be nigh impossible, for each satoshi of a Bitcoin would be worth much, much more than is realistically manageable.
This is where DigiByte comes in and where it shines. DigiByte aims to be used world-wide as an international currency! Not to be hoarded in the same way Bitcoin is. If we were to do some of the same calculations with DigiByte we'd find that the numbers are a lot more reasonable.
At 7.6 billion people, each person could own 2.76315789474 DGB, and each whole unit of DGB would be worth approximately $3,809.52.
$3,809.52
This is much more manageable, and remember, this is still the extreme case where everyone uses DigiByte for everything! I don’t expect this to happen anytime soon, but the supply of DigiByte would allow us to live and transact in a much more realistic and fluid fashion, without having to divide large numbers on our phones’ calculators to understand how much we owe for that cup of coffee. With DigiByte it’s simple: coffee costs 1.5 DGB, the cinema 2.8 DGB, a plane ticket 500 DGB!
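These per-capita figures are plain division: coin supply over population, and total money supply over coin supply. A quick check of the numbers above (the $80 trillion and 7.6 billion estimates are the article’s assumptions):

```python
GLOBAL_MONEY = 80e12   # ~$80 trillion, the estimate used above
POPULATION = 7.6e9     # ~7.6 billion people
BTC_SUPPLY = 21e6
DGB_SUPPLY = 21e9

for name, supply in [("BTC", BTC_SUPPLY), ("DGB", DGB_SUPPLY)]:
    print(f"{name}: {supply / POPULATION:.8f} coins per person, "
          f"${GLOBAL_MONEY / supply:,.2f} per coin")
# BTC: 0.00276316 coins per person, $3,809,523.81 per coin
# DGB: 2.76315789 coins per person, $3,809.52 per coin
```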
There is a reason for DigiByte's large supply, and it is a good one!
Decentralisation
Decentralisation is an important concept for the block-chain and cryptocurrencies in general. It allows for a system which cannot be controlled nor manipulated, no matter how large the organization in play or its intentions. DigiByte’s chain remains out of the reach of even the most powerful government. This allows people to transact freely and openly without fear of censorship.
Decentralisation on the DigiByte block-chain is assured by having an accessible and fair mining protocol in place: the multi-algorithm (MultiAlgo) approach. We believe that all should have access to DigiByte, whether through purchase or by mining. Therefore, DigiByte is minable not only on dedicated mining hardware such as Antminers, but also on conventional graphics cards. The multi-algorithm approach allows users to mine on a variety of hardware types using one of the 5 mining algorithms supported by DigiByte. Those being:
Please note that these mining algorithms are modified and updated from time to time to assure complete decentralisation and thus ultimate security.
The problem with using only one mining algorithm, as Bitcoin or Litecoin do, is that it allows people to continually amass mining hardware and hash power: the more hash power one has, the more one can collect. This leads to a cycle of centralisation and the creation of mining centres. It is known that a massive portion of all hash power in Bitcoin comes from China. This kind of centralisation is a natural tendency, as it is cheaper for large organisations to set up in countries with inexpensive electricity and other such advantages which may be unavailable to the average miner.
DigiByte mitigates this problem with the use of multiple algorithms. It allows miners with many different kinds of hardware to mine the same coin on an even playing field. Mining difficulty is set relative to the mining algorithm used, as sketched below. This allows those with dedicated mining rigs to mine alongside those with more modest machines, all securing the DigiByte chain while maintaining decentralisation.
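Conceptually, per-algorithm difficulty works something like this: each algorithm tracks its own difficulty and adjusts it toward its own target block time, so an influx of ASIC power on one algorithm raises only that algorithm’s difficulty while the others stay mineable. This is an illustrative toy, not DigiByte’s actual adjustment rule, and the algorithm names are placeholders:

```python
class MultiAlgoDifficulty:
    """Track an independent difficulty per mining algorithm and nudge each
    toward its own target block time (illustrative rule, not DigiByte's)."""

    def __init__(self, algos, target_seconds=75):  # e.g. 5 algos x 15 s blocks
        self.difficulty = {algo: 1.0 for algo in algos}
        self.target = target_seconds

    def on_block(self, algo, seconds_since_last_block):
        # Blocks arriving too fast raise that algorithm's difficulty;
        # too slow lowers it. The other algorithms are untouched.
        self.difficulty[algo] *= self.target / max(seconds_since_last_block, 1)

d = MultiAlgoDifficulty(["sha_like", "gpu_friendly"])
d.on_block("sha_like", 30)   # ASIC influx: blocks coming in 2.5x too fast
print(d.difficulty)          # sha_like rises to 2.5, gpu_friendly stays 1.0
```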
Low Fees
Low fees are maintained in DigiByte thanks to the MultiAlgo approach working in conjunction with MultiShield (originally known as DigiShield). MultiShield calls for block difficulty readjustment after every single block on the chain; currently blocks last 15 seconds. This continuous difficulty readjustment allows us to combat any bad actors who may wish to manipulate the DigiByte chain.
Manipulation may be done by a large pool or a single entity with a great amount of hash power mining blocks on the chain, thus increasing the difficulty of the chain. In some coins, such as Bitcoin and Litecoin, difficulty is readjusted only every 2016 blocks, at approximately 10 minutes and 2.5 minutes per block respectively, meaning that Bitcoin’s difficulty is readjusted about every two weeks. This system can allow large bad actors to mine a coin and then abandon it, leaving it with a difficulty level far too high for the remaining hash rate; transactions can be frozen and the chain stalled until there is a difficulty readjustment and/or enough hash power to mine the chain. In such a case users may be faced with a choice: pay exorbitant fees or have their transactions frozen. In an extreme case the whole chain could be frozen completely for extended periods of time.
DigiByte does not face this problem as its difficulty is readjusted per block every 15 seconds. This innovation was a technological breakthrough and was adopted by several other coins in the cryptocurrency environment such as Dogecoin, Z-Cash, Ubiq, Monacoin, and Bitcoin Gold.
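The difference in responsiveness is easy to see with a toy retargeting rule: scale difficulty by how fast the last window of blocks actually arrived versus the expected time. The rule below is a simplification (real implementations clamp and dampen the adjustment), but it shows why a per-block window reacts within seconds while a 2016-block window takes days or weeks:

```python
def retarget(difficulty, actual_seconds, expected_seconds):
    """Scale difficulty by how fast the last window of blocks actually
    arrived versus the expected time (simplified, no clamping)."""
    return difficulty * expected_seconds / max(actual_seconds, 1)

# Bitcoin-style: one adjustment per 2016 blocks (~2 weeks at 10 min each).
# If blocks came twice as fast, difficulty doubles -- but only after ~1 week.
print(retarget(1.0, actual_seconds=2016 * 300, expected_seconds=2016 * 600))  # 2.0

# MultiShield-style: one adjustment per 15-second block.
# A burst of hash power is answered within a single block.
print(retarget(1.0, actual_seconds=5, expected_seconds=15))  # 3.0
```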
This difficulty readjustment, along with the MultiAlgo approach, allows DigiByte to maintain the lowest fees of any UTXO PoW chain in the world. Currently fees on the DigiByte block-chain are about 0.0001 DGB for a transaction sending 100,000 DGB (fees depend on the amount sent). With 100,000 DGB currently worth around $2,000.00, that fee comes to roughly $0.000002, or 0.0002 cents; on the order of 5,000 such transactions would be needed to add up to a penny. This was tested on a Ledger Nano S set to the low-fee setting.
Fast transaction times
Fast transactions are ensured by the conjunctive use of the two aforementioned technology protocols. The use of MultiShield and MultiAlgo keeps mining the DigiByte chain always profitable, so there is always someone mining your transactions. MultiAlgo allows a greater amount of hash power to be spread world-wide; this, along with 15-second block times, allows transactions to be near instantaneous. This speed is also ensured by DigiSpeed, the protocol by which the DigiByte chain gradually decreases block timing. DigiByte started with 30-second block times in 2014; today they are set at 15 seconds. This decrease allows ever faster and ever more transactions per block.
Robust security + The Immutable Ledger
At the core of cryptocurrency security is decentralisation. As stated before, decentralisation is ensured on the DigiByte block-chain by use of the MultiAlgo approach. Each algorithm in DigiByte’s MultiAlgo approach is only allowed about 20% of all new blocks. This, in conjunction with MultiShield, allows DigiByte to be the most secure, most reliable, and fastest UTXO block-chain on the planet. DigiByte is a proof-of-work (PoW) block-chain where all transactional activities are stored on the immutable public ledger world-wide. In DigiByte there is no need for the Lightning protocol (although we have it) nor for sidechains to scale, and thus we keep PoW’s security.
There are many great debates as to the robustness or cleanliness of PoW. The fact remains that PoW block-chains remain the only systems in human history which have never been hacked and thus their security is maximal.
For an attacker to divert the DigiByte chain, they would need to control over 93% of all the hashrate on one algorithm and 51% of each of the other four. DigiByte is thus immune to the infamous 51% attack to which Bitcoin and Litecoin are vulnerable.
Moreover, the DigiByte block-chain is currently spread over 200,000-plus servers, computers, phones, and other machines world-wide. DigiByte is in fact one of the easiest coins to mine, something greatly aided by the recent release of the one-click miner. This allows ever greater decentralisation, which in turn assures that there is no single point of failure and the chain is thus virtually un-attackable.
On Chain Scalability
The biggest barrier for block-chains today is scalability. Visa, the credit card company, can handle around 2,000 transactions per second (TPS) today, allowing it to ensure customer security and transactional throughput nation-wide. Bitcoin currently sits at around 7 TPS and Litecoin at 28 TPS (56 TPS with SegWit). The technological innovations mentioned above come together to make DigiByte the fastest and most scalable PoW block-chain in the world.
DigiByte is scalable because of DigiSpeed, the protocol through which block times are decreased and block sizes are increased. It is known that a simple increase in block size can raise the TPS of any block-chain, as with Bitcoin Cash. That alone, however, is not scalable. The reason is that it would eventually lead to some, if not a great deal, of centralization: larger blocks mean higher storage and hardware costs for miners, and this, along with full blocks from heavy transaction volume, would inevitably bar the average miner once difficulty increases and mining centres consolidate.
Hardware and storage costs decrease over time following Moore’s law, and DigiByte adheres to it perfectly. DigiSpeed calls for block sizes to double and block timing to halve every two years. DigiByte’s blocks were originally 1 MB at 30 seconds each at inception in 2014; in 2016 DigiByte doubled the block size and halved the block timing, perfectly following Moore’s law, which holds that hardware roughly doubles in power for the same cost about every two years.
This allows DigiByte to scale at a steady rate and people to adopt new hardware at an equally steady rate and reasonable expense. Thus the average miner can continue to mine DigiByte on his algorithm of choice with entry-level hardware.
DigiByte was one of the first block-chains to adopt Segregated Witness (SegWit, in 2017), a protocol whereby part of the transactional data is removed from the block and stored elsewhere, decreasing transaction data weight and thus increasing scalability and speed. This allows us to fit more transactions per block without increasing the block size!
DigiByte currently sits at 560 TPS and could scale to over 280,000 TPS by 2035. This dwarfs the TPS capacities, even the projected capacities, of other coins and even private companies. In essence, DigiByte could scale worldwide today and still be reliable and robust. DigiByte could even handle the cumulative transactions of all the top 50 coins on coinmarketcap.com and still run smoothly and below capacity. In fact, to max out DigiByte’s actual maximum capacity (today at 560 TPS), you would have to take all those transactions and multiply them by a factor of 10!
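The 2035 figure is consistent with the doubling schedule described above: assuming the 560 TPS baseline dates from around the 2017 doubling and capacity doubles every two years, nine doublings by 2035 give 560 × 2⁹ ≈ 287,000 TPS. As a one-liner check:

```python
tps_2017 = 560                      # assumed baseline after the 2017 doubling
doublings = (2035 - 2017) // 2      # one doubling every two years
print(tps_2017 * 2 ** doublings)    # 286720 -> roughly the 280,000+ TPS figure
```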
Other Uses for DigiByte
Note that DigiByte is not only to be used as a currency. Its immense robustness, security, and scalability make it ideal for building decentralised applications (DAPPS), which it can host. DigiByte can host DAPPS, and even centralised versions which rely on the chain, known as Digi-Apps. This application layer is also accompanied by a smart-contract layer.
Thus, DigiByte could host several CryptoKitties-style games and more without freezing up or increasing transaction costs for the end user.
Currently there are various DAPPS being built on the DigiByte block-chain, done independently of the DigiByte core team. These companies simply use the DigiByte block-chain as a utility, much in the same way one uses a road to get to work. One such example is Loly, a Tinderesque consensual dating application.
DigiByte also hosts a variety of other platform projects such as the following:
The DigiByte Foundation
As previously mentioned, DigiByte was not an ICO. The DigiByte Foundation was established in 2017 by founder Jared Tate as a non-profit organization dedicated to supporting and developing the DigiByte block-chain.
DigiByte is a community effort and a community coin, to be treated as a public resource like water or air. Anyone can work on DigiByte, create, and do as they wish. It is a permissionless system which encourages innovation and creation. If you have an idea or would like help on your project, do not hesitate to contact the DigiByte Foundation, either through the official website or the developers’ Telegram channel.
For this reason, it is ever more important to note that the DigiByte Foundation cannot exist without public support, and so I encourage all to donate to the foundation. All funds are used for the maintenance of DigiByte servers, marketing, and DigiByte development.
DigiByte Resources and Websites
DigiByte
Wallets
Explorers
Please refer to the sidebar of this sub-reddit for more resources and information.
Edit - Removed Jaxx wallet.
Edit - A new section was added to the article: Why so many coins? 21 Billion
Edit - Adjusted max capacity of DGB's TPS - Note it's actually larger than I initially calculated.
Edit – Grammar and format readjustment
Hello,
I hope you’ve enjoyed my article. I originally wrote this for the reddit sub-wiki, where it generally will most likely, probably, not get a lot of attention. So instead I’ve decided to make this a sort of introductory post, an open letter, to any newcomers to DGB or to those who are just curious.
I tried to cover every aspect of DGB, but of course I may have forgotten something! Please leave a comment down below and tell me why you’re in DGB. What convinced you? For me, it’s the decentralised PoW that really did it. Plus, that transaction speed with virtually no fees made my mouth water!
-Dereck de Mézquita
I'm a student typing this stuff in my free time. Help me pay my debts? Thank you!
D64fAFQvJMhrBUNYpqUKQjqKrMLu76j24g
https://digiexplorer.info/address/D64fAFQvJMhrBUNYpqUKQjqKrMLu76j24g
submitted by xeno_biologist to Digibyte

FYI How to Pi Coin Cryptocurrency Mining Made Simple for Everyday People (Passive Income)


Pi Coin Cryptocurrency Mining Made Simple for Everyday People (Passive Income)

https://preview.redd.it/kxygj89rsej31.jpg?width=1236&format=pjpg&auto=webp&s=a27bf488767eb2a119d38e1f4120a280da692cf9

Combining Cryptocurrency Mining with Traveling

Chances are, you’ve heard about the buzzing prices of cryptocurrencies like Bitcoin but know little about how any of it works. Understanding how Bitcoin and blockchain technology work can be a bit intimidating and overwhelming for users new to the crypto space. But hey, I’m no expert either; I just dabble in it enough not to miss out.
If you aspire to become a digital nomad and travel the world, you may want to consider starting now. Here I can show you how easy it is to begin with zero knowledge of Bitcoin, blockchain, and cryptocurrency.
If you know how to download an app to your smartphone or tablet, then you are ready to earn cryptocurrency. You use your smartphone for pretty much everything: communicating, paying bills, watching YouTube all day, looking for restaurants, and so on. So why not also use it to make money every single day, at no cost beyond downloading an app?
The tides are ever-changing, and to survive, you must sail with the wind. Not against it. Don’t leave free money on the table and read on.
Pi Network (Minepi.com) is getting increasingly popular every day and is perhaps one of the fastest-growing networks in the cryptocurrency world as of 2019. It’s no wonder you would want to get as many members under you as quickly as possible.
Mining the Pi coin takes almost no effort: simply tap the lightning symbol once a day. After that, you can continue to use your phone as normal or shut the screen off. Pi is mined continuously for the next 24 hours in the background of your device, without the massive energy use of Bitcoin mining.
As the Pi Network continues to grow, with thousands of new miners signing up every day, the mining rate will be halved accordingly, which means Pi coins will become harder and slower to get. So it’s quite obviously worth being one of the people who get a really early start on this rising cryptocurrency.
But why am I sharing my secrets? Aren’t I afraid of the competition? No, not really. For one, I am a believer in the Pi vision, so I want to help it grow by helping you grow. And even with this information out there, I doubt that everyone who reads it will actually put in the effort to grow (please prove me wrong).
Please read on and learn the secret!
The math is simple: 24 hours a day, 7 days a week is 168 hours a week, and at 4 weeks a month that is 672 hours a month. If you passively mine 1 Pi an hour with one click on your mobile a day, that is 672 Pi a month; if Pi is listed on the markets at a price of $0.05, that is about $33.60 a month of passive income.
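Spelled out as a quick check (the 1 Pi/hour rate and the $0.05 price are this post’s assumptions, not guarantees):

```python
HOURS_PER_MONTH = 24 * 7 * 4     # 672 hours in four weeks
rate = 1.0                       # assumed mining rate: 1 Pi per hour
price = 0.05                     # assumed listing price in USD
pi_per_month = HOURS_PER_MONTH * rate
print(pi_per_month, round(pi_per_month * price, 2))  # 672 Pi -> $33.6 per month
```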

What Is Pi Network?

Pi Network is a small connected group of people within a security circle, stitched together with other smaller security circles to create a “trust graph” that helps users know whom to trust and transact with. The security circle is used to validate one’s identity, allowing seamless and trusted transactions in the Pi cryptocurrency marketplace. This secures trust in the network so that no fraudulent activity can take place.

The Core Team Members

Dr. Nicolas Kokkalis – Head of Technology: Stanford Ph.D. and instructor of Stanford’s first decentralized applications class; combining distributed systems and human-computer interaction to bring cryptocurrency to everyday people.
Dr. Chengdiao Fan – Head of Product Stanford Ph.D. in Computational Anthropology harnessing social computing to unlock human potential on a global scale.
Vincent McPhillip – Head of Community Yale and Stanford-trained social movement builder on a mission to democratize how society defines, creates, and distributes wealth.

  • Their Mission is to build a cryptocurrency and smart contracts platform secured and operated by everyday people like us, but with simplicity.
  • Their vision is to make Cryptomining and spending as easy as “Pie”, making it the world’s most inclusive peer-to-peer marketplace that is fueled by Pi.
  • The network has about 70k daily active members mining every day on the Pi Network app and is growing incredibly fast. Once it reaches the first threshold of 100k users, the mining rate will be HALVED. That’s right: mining speed will become lower and lower as more users join the network. However, if you join today, become a (PI)oneer, and help steer the Pi Network in the right general direction, you’ll get a larger piece of the PI(E) and can grow it into massive savings until it hits the exchange market.
  • They are going to release an update during Q4 2019 that will enable users to send their Pi coins to any other user on the network. This will be the beginning of seeing the true value of the coin. Personally, I’m all in for the what-if factor, especially when all it took was downloading a single app.
  • You don’t have to spend any extra money to mine, so long as you have a smartphone or a tablet.

Invitation Into The Pi Network

Currently, the Pi Network is in beta, and the only way to join is through an invitation code from someone who is already a member. You can join under my name and be added to my security circle.

What Can You Do with Your Pi Coins?

In the future:


  • Pioneers can wager Pi to engage the attention of other members of the network, by sharing content (e.g., text, images, videos) or asking questions
  • Trade Pi coins with other members
  • Use Pi to purchase goods in the Pi marketplace
  • Use Pi for advertisement
  • Exchange Pi for ETH or BTC which can be exchanged into fiat money
  • Much more as the network grows to mass adoption
As the network is in beta now, you get to mine Pi coins at higher rates, since there are fewer users. Take Bitcoin, for example: mining Bitcoin 10 years ago, you could get a few Bitcoin every hour or so. Now you’ll get somewhere around 0.000001 BTC per hour, since there are millions of miners (mining rates differ for everyone depending on their rig).
Think of Pi as being in the same early stage Bitcoin was, when nobody really knew or understood what its technology was all about. Imagine if you had known then what you know now. Wouldn’t you have mined it like crazy? It goes exactly the same for the Pi coin.

Need To Know More?

https://minepi.com/white-paper https://minepi.com/faq
Comment down below if you joined so I can add you into my security circle.
Thanks for reading. Like it? Share it!
submitted by SandraThelenhere to PiNetworkMining

The Problem with PoW


Miners have always had it rough..
"Frustrated Miners"


The Problem with PoW
(and what is being done to solve it)

Proof of Work (PoW) is one of the most commonly used consensus mechanisms entrusted to secure and validate many of today’s most successful cryptocurrencies, Bitcoin being one. Battle-hardened and having weathered the test of time, Bitcoin has demonstrated the undeniable strength and reliability of the PoW consensus model through sheer market saturation, and of course, its persistency.
In addition to the cost of powerful computing hardware, miners prove that they are benefiting the network by expending energy in the form of electricity, by solving and hashing away complex math problems on their computers, utilizing any suitable tools that they have at their disposal. The mathematics involved in securing proof of work revolve around unique algorithms, each with their own benefits and vulnerabilities, and can require different software/hardware to mine depending on the coin.
Because each block has a unique and entirely random hash, or “puzzle” to solve, the “work” has to be performed for each block individually and the difficulty of the problem can be increased as the speed at which blocks are solved increases.
Hashrates and Hardware Types
While proof of work is an effective means of securing a blockchain, it inherently promotes competition among miners seeking ever-higher hashrates, due to the rewards earned by the node that wins the right to add the next block. In turn, these higher hashrates benefit the blockchain, providing better security when they come from a well-distributed, decentralized network of miners.
When Bitcoin first launched its genesis block, it was mined exclusively by CPUs. Over the years, programmers and developers have devised newer, faster, and more energy-efficient ways to generate higher hashrates; some by perfecting the software end of things, and others, when the incentives are great enough, by creating expensive specialized hardware such as ASICs (application-specific integrated circuits). Built with the express purpose of extracting every last bit of hashing power, efficiency being paramount, ASICs are stripped-down, bare-minimum hardware representations of a specific coin’s algorithm.
This gives ASICs a massive advantage over CPUs and GPUs in both raw hashing power and energy consumption, but with the significant drawback of being very expensive to design and manufacture, translating to a high economic barrier for the casual miner. And because they are hardware representations of a single targeted algorithm, if a project decides to fork and change algorithms suddenly, your powerful brand-new ASIC becomes a very expensive paperweight. The high costs of developing and manufacturing ASICs, and the associated risks, make them unfit for mass adoption at this time.
Somewhere on the high end, in the vast hashrate expanse between GPU and ASIC, sits the FPGA (field-programmable gate array). FPGAs are essentially ASICs that trade some efficiency for flexibility: they are reprogrammable, and often used in the “field” to test an algorithm before implementing it in an ASIC. As a precursor to the ASIC, FPGAs are somewhat similar to GPUs in their flexibility, but they require advanced programming skills and, like ASICs, are expensive and still fairly uncommon.
2 Guys 1 ASIC
One of the issues with proof of work incentivizing the pursuit of higher hashrates lies in how the network calculates block rewards and pays miners for the work they submit. If a coin generates, say, a block a minute, and this is a constant, what happens when more miners jump on the network and do more work? The network cannot pay out more than one block reward per minute, so a difficulty mechanism is used to maintain balance. The difficulty scales up and down in response to the overall nethash: if many miners join the network, or extremely high-hashrate devices such as ASICs or FPGAs jump on, the network responds by making the problems harder, effectively giving the edge to hardware that can solve them faster and rebalancing the network. This not only maintains the block-a-minute reward, it has the side effect of energy requirements that scale up with network adoption.
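For intuition, here is a toy version of such a difficulty mechanism in Python. The target interval and the 4x clamp are modeled loosely on Bitcoin’s retargeting rule, not on any specific coin from this article, and the numbers in the example are made up.

    def retarget(old_difficulty: float, observed_block_time_s: float,
                 target_block_time_s: float = 60.0) -> float:
        """Raise difficulty when blocks arrive too fast, lower it when they arrive too slow."""
        ratio = target_block_time_s / observed_block_time_s
        ratio = max(0.25, min(4.0, ratio))  # clamp each step, as Bitcoin's retarget does
        return old_difficulty * ratio

    # ASICs join and blocks start arriving every 15 s instead of every 60 s:
    print(retarget(1000.0, observed_block_time_s=15.0))   # -> 4000.0, puzzles get harder
    print(retarget(1000.0, observed_block_time_s=120.0))  # -> 500.0, puzzles get easier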
Imagine, for example, one miner alone on a network with a CPU doing 50 MH/s, collecting all 100 coins that can possibly be paid out in a day. If another miner jumps on with the same CPU, each receives 50 coins a day instead of 100, since they split the required work evenly, even though the network’s electrical consumption has doubled along with the work. Electricity costs miners money and is a factor in driving up coin price along with adoption, and since more people are now mining, the coin is less centralized. Now let’s say a large corporation finds it profitable to manufacture an ASIC for this coin, knowing they will make their money back mining it or selling the units to professionals. They join the network doing 900 MH/s and pull in 90 coins a day, while the two guys with their CPUs now get 5 each. Those two guys aren’t very happy, but the corporation is. Not only does this hurt the miners, it compromises the security of the entire network by centralizing the coin supply and hashrate, opening the doors to double spends and 51% attacks from potential malicious actors. Uncertainty of motives and questionable validity in a distributed ledger do not mix.
When technology advances in a field, it is usually applauded and welcomed with open arms, but in the world of crypto things can work quite differently. One of the glaring flaws in the current model, given the advent of specialized hardware, is that it never ends. Suppose the two men from the rather extreme example above take out loans to get themselves the ASIC they heard about, the one that earns 90 coins a day. When they join the other ASIC on the network, the difficulty adjusts to keep daily payouts consistent at 100, and they will each receive only 33 coins instead of 90, since the reward is now split three ways. And what happens when a better ASIC is released by that corporation? Hopefully, those two guys paid off their loans and sold their old ASICs before they became obsolete.
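The payout math in these examples is just proportional sharing, which a few lines of Python make explicit. The names and hashrates below are the ones from the story above; the 100-coins-per-day cap is the article’s hypothetical.

    def daily_payouts(hashrates_mhs: dict, daily_reward: float = 100.0) -> dict:
        """Each miner's expected coins per day, proportional to their hashrate share."""
        total = sum(hashrates_mhs.values())
        return {name: round(daily_reward * rate / total, 1)
                for name, rate in hashrates_mhs.items()}

    print(daily_payouts({"guy 1": 50, "guy 2": 50}))
    # {'guy 1': 50.0, 'guy 2': 50.0}
    print(daily_payouts({"guy 1": 50, "guy 2": 50, "corp ASIC": 900}))
    # {'guy 1': 5.0, 'guy 2': 5.0, 'corp ASIC': 90.0}
    print(daily_payouts({"ASIC 1": 900, "ASIC 2": 900, "ASIC 3": 900}))
    # about 33.3 coins each, no matter how fast the hardware gets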
This system, as it stands now, only perpetuates a never ending hashrate arms race in which the weapons of choice are usually a combination of efficiency, economics, profitability and in some cases control.
Implications of Centralization
This brings us to another big concern with expensive specialized hardware: the risk of centralization. Because they are so expensive and inaccessible to the casual miner, ASICs and FPGAs remain predominantly in the hands of a select few. Centralization occurs when one small group or a single entity controls the vast majority of the hash power and, as a result, the coin supply, and is able to exert its influence to manipulate the market or, in some cases, the network itself (usually the case with dishonest nodes or bad actors).
This is entirely antithetical to what cryptocurrency was born from, and since its inception many concerted efforts have been made to avoid centralization at all costs. An entity in control of a centralized coin would have the power to manipulate the price, and a centralized hashrate would let them degrade network usability and reliability, and even perform double spends leading to the demise of a coin, among other things.
The world of crypto is a strange new place, with rapid advancements across many fields, economies, and borders, leaving plenty of room for improvement; while it may feel like a never-ending game of catch-up, there are many talented developers and programmers working around the clock to bring us all more sustainable solutions.
The Rise of FPGAs
With recent support for the commonly used programming language C++, and thanks to their overall flexibility, FPGAs are becoming somewhat more common, especially in larger farms and industrial settings, but they remain out of reach for most mining enthusiasts and almost unheard of to the average hobby miner. Things appear to be changing, though, one example of which I’ll discuss below. Some believe we will soon see a day when mining with a CPU or GPU just won’t cut it any longer, and the market will be dominated by FPGAs and specialized ASICs, bringing efficiency gains for proof of work while carelessly leading us all toward the next round of spending.
A perfect real-world example of the effect specialized hardware has had on the crypto community was recently discovered, involving a fairly new project called VerusCoin and a fairly new, relatively more economically accessible FPGA. The FPGA is designed to target specific alt-coins whose algorithms do not require RAM overhead. It was discovered that the company had released a new algorithm, kept secret from the public, which could effectively mine Verus at 20x the speed of GPUs, the next-fastest hardware type mining on the Verus network.
Unfortunately, this was done with a deliberately secret approach: calling the Verus algorithm “Algo1,” encouraging owners of the FPGA to never speak of the algorithm in public channels, and admonishing a user when they did let the cat out of the bag. The problem with this business model is that it is parasitic in nature. In an ecosystem where advancements can benefit the entire crypto community, this sort of secret mining approach does not support the philosophies set forth by Bitcoin or the subsequent open-source and decentralization movements.
Although this was not done in the spirit of open source, it does hint at an important step in hardware innovation, where we could see more efficient specialized systems within reach of the casual miner. The FPGA requires a unique set of data called a bitstream to recognize and mine each individual coin’s algorithm. Because it is reprogrammable, with the support of a strong development team creating such bitstreams, the miner doesn’t end up with a brick if an algorithm changes.
All is not lost thanks to.. um.. Technology?
Shortly after discovering FPGAs on the network, the Verus developers quickly designed, tested, and implemented a new, much more complex and improved algorithm via a fork that transitioned Verus smoothly from VerusHash 1.0 to VerusHash 2.0 at block 310,000. Since the fork, VerusHash 2.0 has done exactly what it was designed to do: equalize hardware performance relative to the device being used, enabling CPUs (the most widely available “ASICs”) to mine side by side with GPUs at a profit, and it appears this will also apply to other specialized hardware. This is something no other project has been able to do until now. Rather than pursue the folly of so many projects before it, attempting to be “ASIC proof,” Verus achieved and presents to the world an entirely new model of “hardware homogeny.” As the late, great Bruce Lee once said: “Don’t get set into one form, adapt it and build your own, and let it grow, be like water.”
In the design of VerusHash 2.0, Verus has shown it doesn’t resist progress like so many other new algorithms try to do; it embraces change and adapts to it, the way water becomes whatever vessel it inhabits. This new approach, an industry first, could very well become an industry standard, and in doing so would usher in a new age for proof-of-work coins. VerusHash 2.0 has the potential to correct the single largest design flaw in the proof-of-work consensus mechanism: the ever-expanding monetary and energy requirements that have plagued PoW projects since the mechanism’s inception. Verus also addresses the major issue of coin and nethash centralization by enabling legitimate CPU mining, offering greater coin and hashrate distribution.
Digging a bit deeper, it turns out the Verus development team are no rookies. The lead developer, Michael F. Toutonghi, has spent decades in the field: he is a former Vice President and Technical Fellow at Microsoft, recognized founder and architect of Microsoft's .Net platform, ex-Technical Fellow of Microsoft's advertising platform, ex-CTO of Parallels Corporation, and an experienced distributed-computing and machine-learning architect. The project he helped create employs a diverse array of technologies and security features to form one of the most advanced and secure cryptocurrencies to date. A brief description of what makes VerusCoin special, quoted from a community member:
"Verus has a unique and new consensus algorithm called Proof of Power which is a 50% PoW/50% PoS algorithm that solves theoretical weaknesses in other PoS systems (Nothing at Stake problem for example) and is provably immune to 51% hash attacks. With this, Verus uses the new hash algorithm, VerusHash 2.0. VerusHash 2.0 is designed to better equalize mining across all hardware platforms, while favoring the latest CPUs over older types, which is also one defense against the centralizing potential of botnets. Unlike past efforts to equalize hardware hash-rates across different hardware types, VerusHash 2.0 explicitly enables CPUs to gain even more power relative to GPUs and FPGAs, enabling the most decentralizing hardware, CPUs (due to their virtually complete market penetration), to stay relevant as miners for the indefinite future. As for anonymity, Verus is not a "forced private", allowing for both transparent and shielded (private) transactions...and private messages as well"
If other projects can learn from this, adopt a similar approach, or continue to innovate with new ideas, it could mean an end to the doom-and-gloom predictions that CPU and GPU mining are dead. It would offer a much-needed reprieve and an alternative for miners who have faced the difficult decision of either pulling the plug and shutting down shop, or breaking down their rigs to sell off parts and buy new, more expensive hardware. In doing so, it would present an unprecedented level of decentralization not yet seen in cryptocurrency.
Technological advancements led us to the world of secure digital currencies and the progress being made with hardware efficiencies is indisputably beneficial to us all. ASICs and FPGAs aren’t inherently bad, and there are ways in which they could be made more affordable and available for mass distribution. More than anything, it is important that we work together as communities to find solutions that can benefit us all for the long term.
In an ever-changing world where it is easy to lose sight of the real accomplishments that brought us to this point, one thing is certain: cryptocurrency is here to stay, and the projects doing something to solve the current problems in the proof-of-work consensus mechanism will be the ones that lead us toward our collective vision of a better world, not just for the world of crypto but for each and every one of us.
submitted by Godballz to EtherMining [link] [comments]


Console gaming is hardly different from PC gaming, and much of what people say about PC gaming to put it above console gaming is often wrong.

I’m not sure about you, but for the past few years I’ve been hearing people go on and on about the PC’s “superiority” over the console market. People cite various reasons why they believe gaming on a PC is “objectively” better than console gaming, often related to power, cost, ease of use, and freedom.
…Only problem: much of what they say is wrong.
There are many misconceptions thrown around in the PC gaming vs console gaming debate that I believe need to be addressed. This isn’t about “PC gamers being wrong” or “consoles being the best,” absolutely not. I just want to cut through some of the claims people use to put down console gaming, and show that console gaming is remarkably similar to PC gaming. And yes, this is coming from someone who mainly games on console, but I’m also getting a new PC that I’ll game on as well, not to mention the 30 PC games I already own and play. I’m not particularly partial to one over the other.
Now I will mainly be focusing on the PlayStation side of the consoles, because I know it best, but much of what I say will apply to Xbox as well. Just because I don’t point out many specific Xbox examples, doesn’t mean that they aren’t out there.

“PCs can use TVs and monitors.”

This one isn’t so much a misconception as the implication of one, and overall it’s just… confusing. It appears in some articles and in the pcmasterrace “why choose a PC” section, where they practically imply that consoles can’t do this. Yes, as long as your PC’s ports match your screen’s inputs, you can plug a PC into either… but you can do the same with a console, again, as long as the ports match.
I’m guessing the idea here is that gaming monitors often use DisplayPort, as do most dedicated GPUs, while consoles are generally restricted to HDMI… but even so, monitors often have HDMI ports. In fact, PC Magazine just released its list of the best gaming monitors of 2017, and every single one of them has an HDMI port. A PS4 can be plugged into these just as easily as a GTX 1080.
And even if the monitor/TV doesn’t have HDMI or AV to connect with your console, just use an adapter. If you have a PC whose ports don’t match your monitor/TV, use an adapter. I don’t know what the point of this argument is, but it’s made a worrying number of times.

“On PC, you have a wide range of controller options, but on console you’re stuck with the standard controller."

Are you on PlayStation and wish you could use a specific type of controller that suits your favorite kind of gameplay? Despite what some may believe, you have just as many options as PC.
Want to play fighting games with a classic arcade-style board, featuring the buttons and joystick? Here you go!
Want to get serious about racing and get something more accurate and immersive than a controller? Got you covered.
Absolutely crazy about flying games and, like the racers, want something better than a controller? Enjoy!
Want Wii-style motion controls? They’ve been around since the PS3. If you prefer the form factor of the Xbox One controller but you own a PS4, Hori’s got you covered. And of course, if keyboard and mouse is what keeps you on PC, there’s a PlayStation-compatible solution for that. Want to use the keyboard and mouse you already own? Where there’s a will, there’s a way.
Of course, these aren’t isolated examples; there are plenty of options for each of these kinds of controllers. You don’t have to be on PC to enjoy alternate controllers.

“On PC you could use Steam Link to play anywhere in your house and share games with others.”

PS4 Remote play app on PC/Mac, PSTV, and PS Vita.
PS Family Sharing.
Using the same PSN account on multiple PS4s/Xbox Ones and PS3s/360s, or using multiple accounts on the same console.
In fact, if multiple users are on the same PS4, only one has to buy the game for both users to play it on that one PS4. On top of that, only one of them has to have PS Plus for both to play online (if the one with PS Plus registers the PS4 as their main system).
PS4 Share Play; if two people on separate PS4s want to play a game together that only one of them owns, they can join a Party and the owner of the game can have their friend play with them in the game.
Need I say more?

“Gaming is more expensive on console.”

Part one, the Software
This is one that I find… genuinely surprising. A few times, when I’ve mentioned that part of the reason I chose a PS4 is budget gaming, I’ve been told that “games are cheaper on Steam.” To be fair, a few games on PSN/XBL are more expensive than they are on Steam, so I can see how someone could believe this… but apparently they forgot about discs.
Dirt Rally, a hardcore racing sim game that’s… still $60 on all 3 platforms digitally… even though its successor is out.
So does this mean you have to pay full retail for this racing experience? Nope, because disc prices.
Just Cause 3, an insane open-world experience that could essentially be summed up as “break stuff, screw physics.” And it’s a good example of where the Steam price is lower than PSN and XBL:
Not by much, but still cheaper on Steam, so cheaper on PC… until you look at the disc prices.
See my point? Often, the game is cheaper on console because of the disc alternative that’s available for practically every console game, even when the game is brand new.
Dirt 4 - Remember that Dirt Rally successor I mentioned?
Yes, you could either buy this relatively new game digitally for $60, or just pick up the disc at a discounted price. Again, this is a game that came out 2 months ago, while its predecessor’s digital cost is still locked at $60. Of course, I won’t ignore the fact that Dirt 4 is currently (as of writing this) discounted on Steam, but on PSN it also happens to be discounted by about the same amount.
Part 2: the Subscription
Now… let’s not ignore the elephant in the room: PS Plus and Xbox Gold. These would be ignorable if they weren’t required for online play (on the PlayStation side it’s only required on PS4, but still). So yes, assuming you play online, it’s a cost you have to factor into your PS4 or Xbox One/360. Bummer, right?
Here’s the thing: although you have to factor this $60 annual cost in with your console, you can make it balance out at worst, and make it work for you as a budget gamer at best. As nice as it would be to skip the price entirely, it’s not a problem if you use the membership correctly.
Imagine going to a new restaurant. This restaurant has some meals that you can’t get anywhere else, and fair prices compared to competitors. Only problem: you have to pay a membership fee to have the sides. Now you can have the main course, sit down and enjoy your steak or pasta, but if you want to have a side to have a full meal, you have to pay an annual fee.
Sounds shitty, right? But here’s the thing: not only does this membership allow you to have sides with your meal, but it also allows you to eat two meals for free every month, and also gives you exclusive discounts for other meals, drinks, and desserts.
Let’s look at PS Plus for a minute: for $60 per year, you get:
  • 2 free PS4 games, every month
  • 2 free PS3 games, every month
  • 1 PS4/PS3 and Vita compatible game, and 1 Vita-only game, every month
  • Exclusive/Extended discounts, especially during the weekly/seasonal sales (though you don’t need PS Plus to get sales, PS Plus members get to enjoy the best sales)
  • access to online multiplayer
So yes, you’re paying extra for that membership, but what you get with it pays for itself and then some. Ignoring the discounts for a minute: you get 24 free PS4 games, 24 free PS3 games, and 12 Vita-only plus 12 Vita-compatible games, up to 72 free games every year. Even if you own only one of these consoles, that’s still 24 free games a year. Sure, some months you may get games you don’t like; then just wait until next month.
In fact, let’s look at Just Cause 3 again. It was free for PS Plus members in August, which is a pretty big deal. Why is this significant? Because it’s, again, a $60 digital game. With that one download, you’ve balanced out your $60 annual fee, meaning every free game after that is money saved, and every discount after that is money saved. And this is a trend: every year, PS Plus will release a game that balances out the entire service cost, then another 23 that only add icing to that budget cake. (You could instead count games toward paying off PS Plus until you hit $60 in savings, but still.)
All in all, PS Plus, and Xbox Gold, which offers similar benefits, saves you money. On top of that, you don’t need these memberships to get discounts, but with them you get more discounts.
Now, I’ve seen a few Steam games go up for free for a week, but what about free for an entire month? And even if you want to talk about Steam Summer Sales, what about the PSN summer sale, or, again, disc discounts? It would take a lot of research and math to determine whether every console gamer saves money compared to every Steam gamer on the same games, but at the very least, the costs balance out.
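If you want to sanity-check the break-even claim, the math is trivial to script. The $60 fee and the $60 Just Cause 3 example come from the text above; the other game values and the discount figure are placeholders you can swap for your own purchase history.

    def plus_net_savings(annual_fee: float, free_game_values: list,
                         discount_savings: float) -> float:
        """Net yearly value of the subscription: what you received minus what you paid."""
        return sum(free_game_values) + discount_savings - annual_fee

    # One $60 freebie (like Just Cause 3 in August) already covers the fee;
    # the rest of the year's games and discounts are pure savings.
    print(plus_net_savings(60.0, [60.0, 20.0, 15.0], discount_savings=25.0))  # -> 60.0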
Part 3, the Systems
  • Xbox and PS2: $299
  • Xbox 360 and PS3: $299 and $499, respectively
  • Xbox One and PS4: $499 and $399, respectively.
Rounded up a few dollars, that’s $1,000 - $1,300 in day-one consoles, just to keep up with the games! Crazy, right? So-called budget systems, such a rip-off.
Well, keep in mind that the generations here aren’t short.
The 6th generation, from the launch of the PS2 to the launch of the next-generation consoles, lasted 5 years, or 6 based on the launch of the PS3 (though you could say 9 or 14, since the Xbox wasn’t discontinued until 2009, and the PS2 was supported all the way to 2014, a year after the PS4 was released). The 7th gen lasted 7 - 8 years, again depending on whether you count to the launch of the Xbox 360 or the PS3. The 8th gen has so far lasted 4 years. That’s 17 years that the console money is spread over. A Netflix subscription at its original $8 monthly price over that span would total more than $1,600.
And let’s be fair here: just as you can upgrade your PC hardware whenever you want, you didn’t have to get a console at launch. Take PlayStation again, for example: in 2002, only two years after its release, the PS2’s retail price was cut from $300 to $200. The PS3 Slim, released 3 years after the original, was $300, $100 - $200 below the retail cost. The PS4? You could have gotten either the Uncharted bundle for $350 or one of the PS4 Slim bundles for $250. That brings it all down to $750 - $850, which again is spread over a decade and a half, and doesn’t even count used consoles, sales, or the further price cuts I didn’t mention.
Even if that still sounds like a lot of money to you, even if you’re laughing at the thought of buying new systems every several years, because your PC “is never obsolete,” tell me: how many parts have you changed out in your PC over the years? How many GPUs have you been through? CPUs? Motherboards? RAM sticks, monitors, keyboards, mice, CPU coolers, hard drives— that adds up. You don’t need to replace your entire system to spend a lot of money on hardware.
Even if you weren’t upgrading for the sake of upgrading, I’d be amazed if hardware pushed hard by gaming lasted even a third of that 17-year period. Computer parts aren’t designed to last forever, and they really won’t when you’re pushing them with intensive gaming for hours upon hours. Generally speaking, your components might last 6 - 8 years if you’ve got the high-end stuff. But let’s assume you bought a system 17 years ago that was a beast for its time, something so powerful that even with its parts degrading, it’s still going strong. Problem is: you will have to upgrade something eventually.
Even if you’ve managed to get this far into the gaming realm on the same 17-year-old hardware, I’m betting you didn’t do it on a 17-year-old operating system. How much did Windows 7 cost you? Or 8.1? Or 10? And don’t think you can skirt the cost by getting a pre-built system; the cost of Windows is embedded in the price of the machine (why else would Microsoft allow their OS onto so many machines?).
Sure, Windows 10 was a free upgrade for a year, but that’s only half of its lifetime: you can’t get it for free now, and haven’t been able to for the past year. On top of that, the free period was an upgrade; you had to pay for 7 or 8 first anyway.
The point is, as much as one would like to say they never needed to buy a new system for the sake of gaming, that doesn’t mean they haven’t been paying for hardware, and even if you’ve only been PC gaming recently, you’ll be spending money on hardware soon enough.

“PC is leading the VR—“

Let me stop you right there.
If you add together the total number of Oculus Rifts and HTC Vives sold to this day, and threw in another 100,000 just for the sake of it, that number would still be under the number of PSVR headsets sold.
Why could this possibly be? For a simple reason: affordability. The systems needed to run the PC headsets cost $800+, and the headsets are $500 - $600 when discounted. PSVR, on the other hand, costs $450 for the full bundle (headset, camera, and Move controllers, with a demo disc thrown in), and can be played on either a $250 - $300 console or a $400 console, the latter recommended. Even if you want to argue that the Vive and Rift are more refined, a full PSVR set, system and all, could cost just over $100 more than a Vive headset alone.
If anything, PC isn’t leading the VR gaming market; the PS4 is. It’s the system bringing VR to the most consumers, showing them what the future of gaming could look like. And as the PlayStation line grows more powerful (4.2 TFLOP PS4 Pro, 10 TFLOP “PS5…”), it won’t be long until it can run the same VR games as PC.
Either way, this shows that there is a console equivalent to the PC VR options. Sure, there are some games you'd only be able to play on PC, but there are also some games you'd only be able to play on PSVR.
…Though to be fair, if we’re talking about VR in general, these headsets don’t even hold a candle to, surprisingly, Gear VR.

“If it wasn’t for consoles holding devs back, then they would be able to make higher quality games.”

This one is based on the idea that consoles are so “low spec” that when developers have to take them into account, they can’t make the game nearly as good as it would be otherwise. I mean, have you ever seen the minimum specs for games on Steam?
GTA V
  • CPU: Intel Core 2 Quad CPU Q6600 @ 2.40GHz (4 CPUs) / AMD Phenom 9850 Quad-Core Processor (4 CPUs) @ 2.5GHz
  • Memory: 4 GB RAM
  • GPU: NVIDIA 9800 GT 1GB / AMD HD 4870 1GB (DX 10, 10.1, 11)
Just Cause 3
  • CPU: Intel Core i5-2500k, 3.3GHz / AMD Phenom II X6 1075T 3GHz
  • Memory: 8 GB RAM
  • GPU: NVIDIA GeForce GTX 670 (2GB) / AMD Radeon HD 7870 (2GB)
Fallout 4
  • CPU: Intel Core i5-2300 2.8 GHz/AMD Phenom II X4 945 3.0 GHz or equivalent
  • Memory: 8 GB RAM
  • GPU: NVIDIA GTX 550 Ti 2GB/AMD Radeon HD 7870 2GB or equivalent
Overwatch
  • CPU: Intel Core i3 or AMD Phenom™ X3 8650
  • Memory: 4 GB RAM
  • GPU: NVIDIA® GeForce® GTX 460, ATI Radeon™ HD 4850, or Intel® HD Graphics 4400
Witcher 3
  • Processor: Intel CPU Core i5-2500K 3.3GHz / AMD CPU Phenom II X4 940
  • Memory: 6 GB RAM
  • Graphics: Nvidia GPU GeForce GTX 660 / AMD GPU Radeon HD 7870
Actually, bump all the memory requirements up to 8 GB and those are some decent specs, relatively speaking. And keep in mind these are the minimum specs just to launch the games. It’s almost as if the devs didn’t worry about console specs when making the PC version of the game, because this version isn’t on console. Or maybe the consoles aren’t holding the games back that much, because they’re not that weak. Just a hypothesis.
But I mean, the devs are still ooobviously having to take weak consoles into mind right? They could make their games sooo much more powerful if they were PC only, right? Right?
No. Not even close.
iRacing
  • CPU: Intel Core i3, i5, i7 or better or AMD Bulldozer or better
  • Memory: 8 GB RAM
  • GPU: NVidia GeForce 2xx series or better, 1GB+ dedicated video memory / AMD 5xxx series or better, 1GB+ dedicated video memory
Playerunknown’s Battlegrounds
  • CPU: Intel Core i3-4340 / AMD FX-6300
  • Memory: 6 GB RAM
  • GPU: nVidia GeForce GTX 660 2GB / AMD Radeon HD 7850 2GB
These are PC-only games. That’s right: no consoles to hold them back, no worrying about whether an Xbox One could handle them. Yet they don’t require anything more than the multiplatform games do.
Subnautica
  • CPU: Intel Haswell 2 cores / 4 threads @ 2.5Ghz or equivalent
  • Memory: 4GB
  • GPU: Intel HD 4600 or equivalent - This includes most GPUs scoring greater than 950pts in the 3DMark Fire Strike benchmark
Rust
  • CPU: 2 ghz
  • Memory: 8 GB RAM
  • DirectX: Version 11 (they don’t even list a GPU)
So what’s the deal? Theoretically, if developers don’t have to worry about console specs, then why aren’t they going all-out and making games that no console could even dream of supporting?
Low-end PCs.
What, did you think people only game on Steam if they’ve spent at least $500 on gaming hardware? Not all PC gamers have gaming-PC specs, and if devs closed their games off to players without the strongest PCs, they’d lose a sizable chunk of their potential buyers.
Saying “devs having to deal with consoles is holding gaming back” is like saying “racing teams having to deal with Ford is holding GT racing back.” A: racing teams don’t have to deal with Ford if they don’t want to, which is probably why many of them don’t; and B: even though Ford doesn’t make the fastest cars overall, they still make cars that are awesome on their own; they don’t need to be compared to anything else to prove that they make good cars.
I want to go back to that previous point though, developers having to deal with low-end PCs, because it’s integral to the next point:

“PCs are more powerful, gaming on PC provides a better experience.”

This one isn’t so much of a misconception as it is… misleading.
Did you know that, according to the Steam Hardware & Software Survey (July 2017), well over 50% of Steam gamers use a GPU less powerful than the PS4 Slim’s? Things get dismal when compared to the PS4 Pro (or Xbox One X). On top of that, only about 20% of PC gamers own an Nvidia 10-series card (about 15% counting the 1060, 1070, and 1080).
To be fair, the large majority of gamers have CPUs with considerably high clock speeds, the main factor in CPU gaming performance. But fewer than 50% of Steam gamers have as much RAM as a PS4 or Xbox One, which can really bottleneck what those CPUs can handle.
These numbers are hardly better than they were in 2013, all things considered. Sure, a PS3/360 weeps in the face of even a $400 PC, but in this day and age, consoles have definitely caught up.
Sure, we could mention that even 1% of Steam accounts represents over 1 million accounts, but that doesn’t stack up against the tens of millions of 8th-gen consoles sold; looked at that way, the number of Nvidia 10-series owners may exceed 20 million, but over 5 times more 8th-gen consoles have been sold than that.
Basically, even though PCs run on a spectrum, saying they’re more powerful “on average” is actually wrong. They have the potential to be more powerful, but most of the time people aren’t willing to pay the premium to reach those extra bits of performance.
Now why is this important? What matters are the people who paid the premium for premium parts, right? It matters because of the previous point: PCs don’t have some across-the-board quality advantage over consoles. Developers will always have to keep low-end PCs in mind, because not even half of all PC players can afford the good stuff, and you have to look at the top quarter of Steam players before you reach PS4-Pro-level specs. If every Steam player got a PS4 Pro, it would be an upgrade for over 60% of them, and an Xbox One X would be an upgrade for 70%.
Sure, you could still make the argument that when you pay more for PC parts, you get a better experience than you could with a console. We can argue all day about budget PCs, but a console can’t match up to a $1,000 PC build. It’s the same as paying more for car parts, in the end you get a better car. However, there is a certain problem with that…

“You pay a little more for a PC, you get much more quality.”

The idea here is that as you pay more for PC parts, performance increases faster than price does. Problem: that’s not how technology works. Paying twice as much rarely gets you twice the quality.
For example, let’s look at graphics cards, specifically the GeForce 10 series cards, starting with the GTX 1050.
  • 1.8 TFLOP
  • 1.35 GHz base clock
  • 2 GB VRAM
  • $110
This is our reference, our basis of comparison. Any percentages will be based on the 1050’s specs.
Now let’s look at the GTX 1050 Ti, the 1050’s older brother.
  • 2.1 TFLOP
  • 1.29 GHz base clock
  • 4 GB VRAM
  • $140 retail
This is pretty good. You increase the price by about 27%, and you get an 11% increase in floating-point speed and a 100% increase (double) in VRAM. Sure, you get a slightly lower base clock, but the rest definitely makes up for it. In fact, according to GPU Boss, the Ti managed 66 fps in Battlefield 4, a 22% increase in frame rate, and a 54% increase in MHash/second in bitcoin mining. The cost increase is worth it, for the most part.
But let’s get to the real meat of it: what happens when we double our budget? Surely we should see a massive increase in performance; I bet some of you are willing to wager that twice the cost means more than twice the performance.
The closest price comparison for double the cost is the GTX 1060 (3 GB), so let’s get a look at that.
  • 3.0 TFLOP
  • 1.5 GHz base clock
  • 3 GB VRAM
  • $200 retail
Well… not substantial, I’d say. About a 67% increase in floating-point speed, an 11% increase in base clock, and a 1 GB decrease in VRAM. For [almost] doubling the price, you don’t get much.
Surely raw specs don’t tell the full story, right? Let’s look at some real-world comparisons. Once again according to GPU Boss, there’s a 138% increase in hashes/second for bitcoin mining and, at 99 fps, an 83% frame-rate increase in Battlefield 4. Well then, raw specs do not tell the whole story!
Here’s another one, the 1060’s big brother… or, well, slightly-more-developed twin.
  • 3.9 TFLOP
  • 1.5 GHz base clock
  • 6 GB VRAM
  • $250 retail
Seems reasonable: another $50 for a decent jump in power and double the memory! But, as we’ve learned, we shouldn’t trust the specs for the full story.
I did run a GPU Boss comparison, but for the BF4 frame rate I had to look at Tom’s Hardware (sorry, miners, GPU Boss didn’t cover the MHash/sec spec either). The verdict? Pretty good, I’d say. At 97 fps, a 79% increase over the 1050… wait, 97? That seems too low. I mean, the 3 GB version got 99.
Well, let’s see what Tech Power Up has to say...
94.3 fps. A 74% increase. Huh.
Alright alright, maybe that was just a dud. We can gloss over that I guess. Ok, one more, but let’s go for the big fish: the GTX 1080.
  • 9.0 TFLOP
  • 1.6 GHz base clock
  • 8 GB VRAM
  • $500 retail
That jump in floating-point speed definitely has to mean something, and 4 times the VRAM? Sure, it’s 5 times the price, but as we saw, raw power doesn’t always tell the full story. GPU Boss returns to give us the rundown: how do these cards compare in the real world?
Well… a 222% (more than three-fold) increase in MHash speed, and a 218% increase in fps in Battlefield 4. That’s right: for 5 times the cost, you get about 3 times the performance. Truly, the raw specs don’t tell the full story.
Increase the cost by 27%, and the frame rate in our example game rises 22%. Increase the cost by 83%, and the frame rate rises 83%. Sounds good, but increase the cost by 129% and you get only a 79% frame-rate increase (50 points behind the cost increase). Increase it by 358%, and the frame rate rises just 218% (140 points behind). That’s not paying “a little more for much more power”; that’s a steep drop-off after the third-cheapest option.
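To see the drop-off at a glance, here is the same tally as a short script. Prices and frame rates are the ones cited above, with the 1050’s BF4 baseline back-solved (66 fps being a 22% gain puts it near 54 fps) and the 1080’s derived from the quoted 218% gain, so expect rounding differences of a point or two.

    cards = [  # (name, price in $, Battlefield 4 fps)
        ("GTX 1050",     110,  54),
        ("GTX 1050 Ti",  140,  66),
        ("GTX 1060 3GB", 200,  99),
        ("GTX 1060 6GB", 250,  97),
        ("GTX 1080",     500, 172),
    ]

    base_price, base_fps = cards[0][1], cards[0][2]
    for name, price, fps in cards[1:]:
        cost_up = (price / base_price - 1) * 100  # % price increase over the 1050
        fps_up = (fps / base_fps - 1) * 100       # % frame-rate increase over the 1050
        print(f"{name}: +{cost_up:.0f}% cost, +{fps_up:.0f}% fps, gap {fps_up - cost_up:+.0f}")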
In fact, did you know you have to go up to the 1060 (6GB) before the GTX line compares to a PS4 Pro? Not to mention that at $250, the price of a 1060 (6GB), you could get an entire PS4 Slim bundle, or that you have to reach the 1070 before you beat the Xbox One X.
On another note, let’s look at a PS4 Slim…
  • 1.84 TFLOP
  • 800 MHz base clock
  • 8 GB VRAM
  • $300 retail
…Versus a PS4 Pro.
  • 4.2 TFLOP
  • 911 MHz base clock
  • 8 GB VRAM
  • $400 retail
A 128% increase in floating-point speed and a 14% increase in clock speed, for a 33% increase in cost. Unfortunately there is no Battlefield 4 comparison to make, but in BF1 the frame rate is doubled (30 fps to 60) and the textures are taken to 11. For what that looks like, I’ll leave it up to this bloke. Not to mention that you can get the texture buffs in 4K. Just as you get a decent price-relative performance increase with the lower-cost GPUs, the same applies here.
It’s even worse when you look at the CPU for a gaming PC. The more money you spend, again, the less benefit you get per dollar. Hardware Unboxed covers this in a video comparing different tiers of Intel CPUs. One thing to note is that in just about every game, the highest i7 option in the video (the 6700K) was almost always within 10 fps (for a few games, 15 fps) of a certain CPU on that list.
…That CPU was the lowest i3 option (the 6100). The lowest i3 was $117 and the highest i7 was $339, a 189% price difference for what was, on average, a 30% or smaller difference in frame rate. Even the lowest Pentium option (the G4400, $63) was often able to keep up with the i7.
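The same per-dollar framing works for the CPUs. In this sketch the i3’s frame rate is normalized to 100 and the i7’s to 130, a stand-in for the roughly 30% gap cited above; the prices are the ones from the video.

    # Relative frame rates: i3 normalized to 100, i7 to 130 (the ~30% gap cited above).
    for name, price, rel_fps in [("i3-6100", 117, 100), ("i7-6700K", 339, 130)]:
        print(f"{name}: ${price / rel_fps:.2f} per relative frame")
    # i3-6100: $1.17 per frame; i7-6700K: $2.61 per frame - over double the cost per frame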
The CPU and GPU are usually the most expensive and power-hungry parts of a build, which is why I focused on them (besides the fact that, outside of RAM, they’re the two most important parts of a gaming PC). With both, this “pay more to get much more performance” idea is pretty much the inverse of the truth.

“The console giants are bad for game developers, Steam doesn't treat developers as bad as Microsoft or especially Sony.”

Now one thing you might’ve heard is that the PS3 was incredibly difficult for developers to make games for, which for some, fueled the idea that console hardware is difficult too develop on compared to PC… but this ignores a very basic idea that we’ve already touched on: if the devs don’t want to make the game compatible with a system, they don’t have to. In fact, this is why Left 4 Dead and other Valve games aren’t on PS3, because they didn’t want to work with it’s hardware, calling it “too complex.” This didn’t stop the game from selling well over 10 million units worldwide. If anything, this was a problem for the PS3, not the dev team.
This also ignores that games like LittleBigPlanet, Grand Theft Auto IV, and Metal Gear Solid 4 all came out in the same year as Left 4 Dead (2008) on PS3. Apparently, plenty of other dev teams didn’t have much of a problem with the PS3’s hardware, or at the very least, they got used to it soon enough.
On top of that, when developing the 8th-gen consoles, both Sony and Microsoft sought out CPUs that were easier for developers to work with, which included design decisions that accounted for the consoles being used for more than gaming. Moreover, using their single-chip proprietary CPUs is cheaper and more energy-efficient than buying pre-made CPUs and boards, which is a far better reason for using them than some conspiracy about Sony and MS trying to make devs' lives harder.
Now, console exclusives are apparently a point of contention: it’s often said that exclusivity can cause developers to go bankrupt. However, exclusivity doesn’t have to be a bad thing for the developer. For example, when Media Molecule pitched their game to a publisher (Sony, coincidentally), they didn’t end up tied into something detrimental to them.
Their initial funding lasted for 6 months. From there, Sony offered additional funding in exchange for console exclusivity. This may sound concerning to some, but the game went on to sell almost 6 million units worldwide and launched Media Molecule into the gaming limelight. Sony later bought the development studio, but 1: this was in 2010, two years after LittleBigPlanet’s release, and 2: Media Molecule seem pretty happy about it to this day. If anything, signing up with Sony was one of the best things they could’ve done, in their opinion.
Does this sound like a company that has it out for developers? There are plenty of examples that people will use to put Valve in a good light, but even Sony is comparatively good to developers.

“There are more PC gamers.”

The total number of active PC gamers on Steam has surpassed 120 million, which is impressive, especially considering that this number is nearly double 2013’s figure (65 million). But the number of monthly active users on Xbox Live and PSN? About 120 million (1, 2) total. EDIT: You could argue that this isn't an apples-to-apples comparison, sure; so if you’d rather compare the monthly number of Steam users to console figures, Steam has about half of what the consoles do, at 67 million.
Now, back to the 65 million total user figure for Steam: the best reference I could find for PlayStation was an article giving the number of registered PSN accounts in 2013, 150 million. Over a similar 4-year period (2009 - 2013), the number of registered PSN accounts didn’t double, it sextupled. Considering the PS4 has already reached 2/3 of the PS3’s total sales despite being 3 years younger than its predecessor was, I’m sure this trend is at least generally consistent.
For example, let’s look at DOOM (2016), an awesome fast-paced shooter with graphics galore… Of course, as a single platform, it sold best on PC/Steam: 2.36 million Steam sales, 2.05 million PS4 sales, 1.01 million Xbox One sales.
But keep in mind… when you add the console sales together, you get over 3 million sales on the 8th-gen systems. Meaning: this game sold best on console. In fact, the Steam sales have only recently surpassed the PS4 sales. By the way, VGChartz only shows sales for physical copies, so the PS4 and Xbox numbers, with digital sales included, are even higher than 3 million.
This isn’t uncommon, by the way.
Even with the games where the PC sales are higher than either console individually, there are generally more console sales in total. But, to be fair, this isn’t anything new. PC gamers have never dominated the market; the split has always been roughly like this. PC can end up being the largest single platform for a game, but consoles usually sell more copies total.
EDIT: There were other examples but... Reddit has a 40,000-character limit.

"Modding is only on PC."

Modding on Xbox One is already in the works, and Bethesda is helping with that.
PS4 isn't far behind either. You could argue that these are the beta stages of console modding, but that just means modding on consoles will only grow.

What’s the Point?

This isn’t to say that there’s anything wrong with PC gaming, and it isn’t to exalt consoles. I’m not here to be the hipster defending the little guy, nor to put down someone or something out of spite. This is about showing that PCs and consoles are overall pretty similar, and that there isn’t anything wrong with being a console gamer. There isn’t some chasm separating consoles and PCs; at the end of the day, they’re both computers that are (generally) designed for gaming. This is about unity as gamers: there shouldn’t be a massive divide just because of the computer system you game on. I want gamers to be in an environment where specs don't separate us; whether you got a $250 PS4 Slim or just built a $2,500 gaming PC, we’re here to game and should be able to have healthy interactions regardless of platform.
I’m well aware that this isn’t going to fix… much, but it needs to be said: there isn’t a huge divide between PCs and consoles; they’re far more similar than people think. Each has upsides and downsides the other doesn’t. There’s so much more I could touch on, like how you can use SSDs or 3.5-inch hard drives with both, or how, even though PC part prices go down over time, so do console prices, but I just wanted to address the main points people use to needlessly separate the two kinds of systems (looking at you, PCMR) and correct them, to get the point across.
I thank anyone who takes the time to read all of this, and especially anyone who doesn’t take what I say out of context. I also want to note that, again, this isn’t “anti-PC gamer.” If it were up to me, everyone would be a hybrid gamer.
Cheers.
submitted by WhyyyCantWeBeFriends to unpopularopinion

The Problem with PoW

"Frustrated Miners"

The Problem with PoW
(and what is being done to solve it)

Proof of Work (PoW) is one of the most commonly used consensus mechanisms entrusted to secure and validate many of today’s most successful cryptocurrencies, Bitcoin being one. Battle-hardened and having weathered the test of time, Bitcoin has demonstrated the strength and reliability of the PoW consensus model through sheer market saturation and, of course, its persistence.
On top of the cost of powerful computing hardware, miners prove that they are benefiting the network by expending energy in the form of electricity, hashing away at computational puzzles with whatever suitable tools they have at their disposal. The mathematics behind proof of work revolve around unique algorithms, each with its own benefits and vulnerabilities, and mining can require different software and hardware depending on the coin.
Because each block presents a unique and effectively random “puzzle” to solve, the “work” has to be performed for each block individually, and the difficulty of the problem can be raised as blocks start being solved faster.
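As an illustration of that “work,” here is a minimal sketch of a hash puzzle (a toy example assuming a double-SHA256 scheme and a leading-zero-bits target, not any specific coin’s actual algorithm):

```python
import hashlib

def mine(block_data: bytes, difficulty_bits: int) -> int:
    """Brute-force a nonce until the double-SHA256 of (data + nonce)
    falls below the target implied by `difficulty_bits` leading zero bits."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        payload = block_data + nonce.to_bytes(8, "big")
        digest = hashlib.sha256(hashlib.sha256(payload).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # the "proof": costly to find, cheap for anyone to verify
        nonce += 1

# Each extra difficulty bit doubles the expected work:
print(mine(b"example block header", 16))
```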

Hashrates and Hardware Types

While proof of work is an effective means of securing a blockchain, it inherently promotes competition among miners seeking ever-higher hashrates, due to the rewards earned by the node that wins the right to add the next block. In turn, these higher hashrates benefit the blockchain, providing better security when they come from a well-distributed, decentralized network of miners.
When Bitcoin’s genesis block was first mined, it was mined exclusively by CPUs. Over the years, programmers and developers have devised newer, faster, and more energy-efficient ways to generate higher hashrates: some by perfecting the software end of things, and others, when the incentives are great enough, by creating expensive specialized hardware such as ASICs (application-specific integrated circuits). Built with the express purpose of extracting every last bit of hashing power, efficiency being paramount, ASICs are stripped-down, bare-minimum hardware implementations of a specific coin’s algorithm.
This gives ASICs a massive advantage over CPUs and GPUs in both raw hashing power and energy consumption, but with the significant drawback of being very expensive to design and manufacture, translating to a high economic barrier for the casual miner. And because they are hardware implementations of a single targeted algorithm, if a project decides to fork and change algorithms suddenly, your powerful brand-new ASIC becomes a very expensive paperweight. The high costs of developing and manufacturing ASICs, and the associated risks, make them unfit for mass adoption at this time.
Somewhere on the high end, in the vast hashrate expanse between GPU and ASIC, sits the FPGA (field-programmable gate array). FPGAs are essentially ASICs that trade some efficiency for flexibility: they are reprogrammable, and are often used in the “field” to test an algorithm before implementing it in an ASIC. As a precursor to the ASIC, FPGAs are somewhat similar to GPUs in their flexibility, but they require advanced programming skills and, like ASICs, are expensive and still fairly uncommon.

2 Guys 1 ASIC

One of the issues with proof of work incentivizing the pursuit of higher hashrates lies in how the network calculates block-reward (coinbase) payouts and rewards miners for the work they have submitted. If a coin generates, say, a block a minute, and this is a constant, then what happens if more miners jump on the network and do more work? The network cannot pay out more than one block reward per minute, so a difficulty mechanism is used to maintain balance. The difficulty scales up and down in response to the overall nethash: if many miners join the network, or extremely high-hashing devices such as ASICs or FPGAs jump on, the network responds by making the problems harder, effectively giving an edge to hardware that can solve them faster and keeping the network balanced. This not only maintains the block-a-minute reward, it has the side effect of energy requirements that scale up with network adoption.
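A rough sketch of such a difficulty mechanism might look like the following (the function name, window size, and clamp are illustrative assumptions, loosely modeled on Bitcoin’s retarget rule, not any specific coin’s exact code):

```python
def retarget(old_difficulty: float, actual_window_s: float, expected_window_s: float) -> float:
    """If the last window of blocks arrived faster than scheduled,
    scale difficulty up proportionally; if slower, scale it down."""
    ratio = expected_window_s / actual_window_s
    ratio = max(0.25, min(4.0, ratio))  # clamp extreme swings, as Bitcoin does
    return old_difficulty * ratio

# Hashrate doubles, so a window of one-minute blocks arrives in half the time:
print(retarget(1000.0, actual_window_s=30 * 2016, expected_window_s=60 * 2016))  # 2000.0
```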
Imagine, for example, one miner alone on a network, with a CPU doing 50 MH/s and getting all 100 coins that can possibly be paid out in a day. If another miner jumps on the network with the same CPU, each would receive 50 coins a day instead of 100, since they split the work evenly, even though the network’s electrical consumption has doubled along with the work. Electricity costs miners money and is a factor in driving up coin price along with adoption, and since more people are now mining, the coin is less centralized. Now let’s say a large corporation finds it profitable to manufacture an ASIC for this coin, knowing it will make its money back mining or selling the units to professionals. It joins the network doing 900 MH/s and will be pulling in 90 coins a day, while the two guys with their CPUs now get 5 each. Those two guys aren’t very happy, but the corporation is. Not only does this hurt the miners, it compromises the security of the entire network by centralizing the coin supply and hashrate, opening the doors to double spends and 51% attacks from potential malicious actors. Uncertainty of motives and questionable validity do not mix with a distributed ledger.
When technology advances in a field, it is usually applauded and welcomed with open arms, but in the world of crypto things can work quite differently. One of the glaring flaws in the current model, with the advent of specialized hardware, is that it never ends. Suppose the two men from the rather extreme example above take out loans to get themselves that ASIC they heard about, the one that earns 90 coins a day. When they join the other ASIC on the network, the difficulty adjusts to keep daily payouts at 100, and each will receive only 33 coins instead of 90, since the reward is now split three ways. And what happens when a better ASIC is released by that corporation? Hopefully, those two guys were able to pay off their loans and sell their old ASICs before they became obsolete.
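Both examples boil down to proportional splitting of a fixed daily emission, which a few lines make concrete (a sketch; miner names and the MH/s figures are the ones from the examples above):

```python
# A fixed emission of 100 coins/day is split in proportion to each
# miner's share of the total network hashrate.
def daily_coins(hashrates_mhs, emission=100.0):
    total = sum(hashrates_mhs.values())
    return {name: round(emission * rate / total, 1) for name, rate in hashrates_mhs.items()}

print(daily_coins({"cpu_1": 50}))                                  # {'cpu_1': 100.0}
print(daily_coins({"cpu_1": 50, "cpu_2": 50}))                     # 50.0 each
print(daily_coins({"cpu_1": 50, "cpu_2": 50, "asic": 900}))        # 5.0, 5.0, 90.0
print(daily_coins({"asic_1": 900, "asic_2": 900, "asic_3": 900}))  # ~33.3 each
```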
This system, as it stands, only perpetuates a never-ending hashrate arms race in which the weapons of choice are a combination of efficiency, economics, profitability, and, in some cases, control.

Implications of Centralization

This brings us to another big concern with expensive specialized hardware: the risk of centralization. Because they are so expensive and inaccessible to the casual miner, ASICs and FPGAs remain limited to a select few. Centralization occurs when one small group or a single entity controls the vast majority of hash power, and as a result the coin supply, and is able to exert that influence to manipulate the market or, in some cases (usually involving dishonest nodes or bad actors), the network itself.
This is entirely antithetical to what cryptocurrency was born from, and since its inception many concerted efforts have been made to avoid centralization at all costs. An entity in control of a centralized coin would have the power to manipulate the price, and control of a centralized hashrate would let it degrade network usability and reliability, and even perform double spends, leading to the demise of the coin, among other things.
The world of crypto is a strange new place, with rapid advancements across many fields, economies, and borders, leaving plenty of room for improvement; while it may feel like a never-ending game of catch-up, there are many talented developers and programmers working around the clock to bring us all more sustainable solutions.

The Rise of FPGAs

With recent support for programming them in the widely used language C++, and thanks to their overall flexibility, FPGAs are becoming somewhat more common, especially in larger farms and industrial settings; but they still remain out of the hands of most mining enthusiasts and almost unheard of to the average hobby miner. Things appear to be changing, though (one example of which I’ll discuss below), and some think we will soon see a day when mining with a CPU or GPU just won’t cut it any longer, and the market will be dominated by FPGAs and specialized ASICs, bringing efficiency gains for proof of work while also carelessly leading us all toward the next round of spending.
A perfect real-world example of the effect specialized hardware has had on the crypto community was recently discovered, involving a fairly new project called Verus Coin (https://veruscoin.io/) and a fairly new, comparatively affordable FPGA. The FPGA is designed to target specific alt-coins whose algorithms do not require RAM overhead. It was discovered that the company had released a new algorithm, kept secret from the public, which could effectively mine Verus at 20x the speed of GPUs, the next-fastest hardware type mining on the Verus network.
Unfortunately, this was done with a deliberately secretive approach: the company called the Verus algorithm “Algo1,” encouraged owners of the FPGA never to speak of the algorithm in public channels, and admonished a user when they did let the cat out of the bag. The problem with this business model is that it is parasitic in nature. In an ecosystem where advancements can benefit the entire crypto community, this sort of secret mining does not support the philosophies set forth by Bitcoin or the subsequent open-source and decentralization movements.
Although this was not done in the spirit of open source, it does hint at an important step in hardware innovation, one where we could see more efficient specialized systems within reach of the casual miner. The FPGA requires a unique set of data, called a bitstream, to recognize each individual coin’s algorithm and mine it. Because it’s reprogrammable, with the support of a strong development team producing such bitstreams, the miner doesn’t end up with a brick if an algorithm changes.

All is not lost thanks to.. um.. Technology?

Shortly after discovering FPGAs on the network, the Verus developers designed, tested, and implemented a new, much more complex and improved algorithm via a fork that transitioned Verus smoothly from VerusHash 1.0 to VerusHash 2.0 at block 310,000. Since the fork, VerusHash 2.0 has done exactly what it was designed for: equalizing hardware performance relative to the device being used, enabling CPUs (the most widely available “ASICs”) to mine side by side with GPUs at a profit, and it appears this will also apply to other specialized hardware. This is something no other project has been able to do until now. Rather than pursue the folly of so many projects before it, attempting to be “ASIC-proof,” Verus presents to the world an entirely new model of “hardware homogeny.” As the late, great Bruce Lee once said: “Don’t get set into one form, adapt it and build your own, and let it grow, be like water.”
In the design of VerusHash 2.0, Verus has shown it doesn’t resist progress like so many other new algorithms; it embraces change and adapts to it the way water becomes whatever vessel it inhabits. This new approach, an industry first, could very well become an industry standard and, in doing so, would usher in a new age for proof-of-work coins. VerusHash 2.0 has the potential to correct the single largest design flaw in the proof-of-work consensus mechanism: the ever-expanding monetary and energy requirements that have plagued PoW-based projects since the mechanism’s inception. Verus also tackles the major issue of coin and nethash centralization by enabling legitimate CPU mining, offering broader coin and hashrate distribution.
Digging a bit deeper, it turns out the Verus development team are no rookies. The lead developer, Michael F. Toutonghi, has spent decades in the field and is a former Vice President and Technical Fellow at Microsoft, recognized founder and architect of Microsoft's .Net platform, ex-Technical Fellow of Microsoft's advertising platform, ex-CTO of Parallels Corporation, and an experienced distributed-computing and machine-learning architect. The project he helped create employs a diverse set of technologies and security features, forming one of the most advanced and secure cryptocurrencies to date. A brief description of what makes VerusCoin special, quoted from a community member:
"Verus has a unique and new consensus algorithm called Proof of Power which is a 50% PoW/50% PoS algorithm that solves theoretical weaknesses in other PoS systems (Nothing at Stake problem for example) and is provably immune to 51% hash attacks. With this, Verus uses the new hash algorithm, VerusHash 2.0. VerusHash 2.0 is designed to better equalize mining across all hardware platforms, while favoring the latest CPUs over older types, which is also one defense against the centralizing potential of botnets. Unlike past efforts to equalize hardware hash-rates across different hardware types, VerusHash 2.0 explicitly enables CPUs to gain even more power relative to GPUs and FPGAs, enabling the most decentralizing hardware, CPUs (due to their virtually complete market penetration), to stay relevant as miners for the indefinite future. As for anonymity, Verus is not a "forced private", allowing for both transparent and shielded (private) transactions...and private messages as well"

If other projects can learn from this and adopt a similar approach, or continue to innovate with new ideas, it could mean an end to the doom-and-gloom predictions that CPU and GPU mining are dead, offering a much-needed reprieve to miners who have faced the difficult decision of either pulling the plug and shutting down shop, or breaking down their rigs to sell off parts and buy new, more expensive hardware… and, in so doing, it would present an unprecedented level of decentralization not yet seen in cryptocurrency.
Technological advancements led us to the world of secure digital currencies, and the progress being made in hardware efficiency is indisputably beneficial to us all. ASICs and FPGAs aren’t inherently bad, and there are ways they could be made more affordable and more widely available. More than anything, it is important that we work together as communities to find solutions that can benefit us all for the long term.

In an ever-changing world where it may be easy to lose sight of the real accomplishments that brought us to this point, one thing is certain: cryptocurrency is here to stay, and the projects doing something to solve the current problems in the proof-of-work consensus mechanism will be the ones that lead us toward our collective vision of a better world, not just for the world of crypto but for each and every one of us.
submitted by Godballz to CryptoTechnology


• How To Build Crypto Mining Rig W/ $2000 or LESS - Beginner ...
• How To Build a Crypto GPU Mining Rig With $1000 or Less ...
• How To Build The Cheapest Mining Rig Possible! - YouTube
• $80,000 Mining Rig Interview - 70x 1080tis! - YouTube
• The BEST GPU & CPU Mining Rig Build Guide 2019 | Windows ...

• Bitcoin mining still remains one of the best ways to make a profit in the crypto industry, although it is not exactly easy to do it by yourself anymore. With the demand being as great as it is, it is much easier to simply join one of the Bitcoin mining pools and help out, rather than try to win the block for yourself. But, no matter what you try to do, you need two things in order to do it ...
• There's more than one way to make money from the recent Bitcoin craze, which has seen the value of the digital currency increase more than six-fold over the past few months. Yes, you can do it the old ...
• If you were wondering how to build an affordable mining rig step by step, then this 1050 Ti 4GB eight-GPU rig build makes a great option. Not only do these GPUs make a great low-cost mining rig, but they're very energy efficient as well. The 1050 Ti cards can also mine a wide variety of cryptocurrencies like Ethereum and Ravencoin, to name just a ...
• An ASIC-based system recently developed by specialist mining rig firm Avalon has a 66,300 Mhash/sec capacity with a 620-watt overhead, equating to 106 Mhash/watt. Read updated ASIC bitcoin miner ...
• The fastest and most efficient mining hardware is going to cost more. Don't try to buy a miner based on only price or only hash rate; the best ASIC miner is the most efficient bitcoin miner. Aim for value. If you're a hobby miner who wants to buy a couple rigs for your house, eBay and Amazon both have some decent deals on mining hardware. Used ...



• Here's the first part in the series on how to build the best RGB-laden (or not, it's a free country) mining rig possible. We got part selection up first toda...
• PRESTIGEMINING.US Custom 4U Server Rack Mining Rig Ethereum, ZCASH (8) EVGA GeForce GTX 1060 6GB 8 GB DDR4 RAM 120 GB M.2 SSD 1600 watt PSU (7) 120mm Cooling...
• Vosk reviews how to build the best cheap beginner crypto GPU mining rig in a few easy steps! Anyone can build this Duo Mining Rig for less than $1000 with no...
• Want to build a GPU+CPU Mining Rig that earns passive income with Cryptocurrency? Look no further! Best mining rig build of 2019! Subscribe to VoskCoin - h...
• Most people view building a mining rig as an expensive or confusing thing to do. However, we break down what exactly you need for your mining rig & how to do...
