Every day we recommend one high-quality GitHub open source project and one hand-picked English tech or programming article. Follow 開源日報 (Open Source Daily). QQ group: 202790710; Telegram group: https://t.me/OpeningSourceOrg


Today's recommended open source project: "Computer Science", a community-built computer science curriculum. GitHub: https://github.com/ossu/computer-science

Why we recommend it: Want to learn computer science? This open source project is a collaboratively maintained computer science curriculum, particularly well suited to anyone studying the field. It is not limited to career or professional development: it is meant for people who want a proper, well-rounded grounding in the discipline, and it offers a worldwide community where self-directed learners with the drive and curiosity to complete this education on their own can study and discuss together.


The courses come mainly from Harvard, Princeton, MIT, and other universities, and have been carefully selected against the following criteria:

  1. Open for enrollment to everyone;
  2. Offered on a regular schedule;
  3. Meets OSSU's academic standards;
  4. Progresses smoothly from introductory to advanced material;
  5. High quality and well suited to teaching.

Notes:

  1. If you put in 18-22 hours of study per week, you can finish in roughly two years.
  2. Apart from a small number of courses, most of the material is free.
  3. Abide by the honor code you agree to before each course begins.
  4. FAQ: https://github.com/ossu/computer-science/blob/master/FAQ.md
  5. For questions not covered there, ask on the forum: https://www.reddit.com/r/opensourcesociety/

Before you begin:

  1. A high school background in math and physics
  2. Know which courses you want to take

The learning process:

  1. Work through the courses.
  2. Collaborating with other learners is encouraged.
  3. There is a final project, which may be written in any language, so that you apply what you have learned to a real-world problem.
  4. Your work is evaluated when you finish (by instructors, your peers, and experienced practitioners).
  5. After finishing, you can go on to a job, read professional books to deepen your knowledge, join a local user group, or keep following how software evolves around the world.

Today's recommended English article: "How integrated, open infrastructure meets tomorrow's computing demands"

Author:     Link to the original: https://opensource.com/article/18/5/open-infrastructure

Why we recommend it: How can integrated, open infrastructure meet tomorrow's computing demands? The beauty of open infrastructure lies in its flexibility: solutions can be added according to each organization's particular needs, and that is genuinely appealing.

How integrated, open infrastructure meets tomorrow's computing demands

Cloud infrastructure is quickly becoming an integral business component for nearly every company.

By virtualizing physical compute, networking, and storage resources, the cloud model of computing makes it possible to transform data center resources—previously limited, expensive and difficult to provision for users in a timely way—into flexible, elastic and easily consumable resources. Applications developers love cloud because the infrastructure resources are available on demand and are programmatically accessible via APIs, which helps them innovate more quickly; they no longer have to worry about building or requisitioning the infrastructure to test and deploy their applications—they just consume it! Operators love cloud because it enables them to deliver more powerful services faster, more cost-efficiently, more securely and with high degrees of automation. CIOs love cloud because it leads to better utilization of resources and a reduction in total cost of ownership for infrastructure. CEOs love cloud because it makes their organizations more agile—able to innovate faster and respond more quickly to set themselves apart in a competitive marketplace.
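
The "programmatically accessible via APIs" point is easy to see in code. Below is a minimal sketch, not taken from the article, using the openstacksdk Python library to boot a server on demand; the cloud name "my-cloud" and the image and flavor names are placeholder assumptions that must match your own clouds.yaml and deployment.

```python
# Minimal sketch: consuming cloud infrastructure through an API with openstacksdk.
# Assumptions: a "my-cloud" entry exists in clouds.yaml, and the named image and
# flavor exist in that deployment (clouds with several networks may also require
# an explicit network to be passed to create_server).
import openstack

conn = openstack.connect(cloud="my-cloud")  # credentials are read from clouds.yaml

image = conn.compute.find_image("ubuntu-22.04")   # placeholder image name
flavor = conn.compute.find_flavor("m1.small")     # placeholder flavor name

server = conn.compute.create_server(
    name="demo-app",
    image_id=image.id,
    flavor_id=flavor.id,
)
server = conn.compute.wait_for_server(server)  # block until the server is ACTIVE
print(server.name, server.status)
```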

Needless to say, the cloud is popular for good reason, and enterprising companies are taking the lead in meeting the market demand with public cloud services. Three companies, in particular, have set themselves apart: Amazon Web Services, Microsoft Azure, and Google Cloud Platform together account for half of the global public cloud market. Increasingly, however, voices are being raised against the inevitability of a future in which all of the world's infrastructure needs will be met by "The Big 3." In fact, many large companies opt to deploy their own infrastructure rather than pay for the high margins of those Internet giants (who use those margins to finance their aggressive expansion into new areas). Instead of essentially funding their own long-term extinction by paying for public cloud services, companies like Walmart, Best Buy, eBay, PayPal, AT&T, and Comcast have chosen a different path for their infrastructure needs.

And they are not alone. Others have made the choice to resist the hegemony of The Big 3 by offering their own public cloud alternative in the form of local and interoperable clouds. This is the case with OVH in France, Deutsche Telekom in Germany, City Network in Sweden, and VEXXHOST in Canada.

Likewise, in the area of scientific research, we find organizations where high-performance computing power is essential and every dollar must be used to the best advantage, and these institutions are opting to build their own infrastructure. CERN deployed an impressive private cloud (300,000 CPU cores, 800 TB of RAM) to cost-effectively analyze the results of the Large Hadron Collider and other experiments. It's now collaborating with other research institutions to build the compute and storage infrastructure for the Square Kilometer Array, an Earth-sized telescope expected to produce petabytes of data every day.

In China, where everything is oversized and there is an inherent mistrust of American suppliers, the choice to build your own infrastructure for both enterprise and government use is a natural one. We already find such infrastructure supporting trains for China Railway, the payment system for China UnionPay, and telephone services for millions of China Mobile customers.

To build their own infrastructure, these organizations are making a wise and strategic investment in the alternative to The Big 3: open infrastructure—combining the best programmable, flexible and interoperable open source software to build their own infrastructure. OpenStack provides the basis for this infrastructure, on which it is possible to deploy further cloud-native open source solutions like Kubernetes. Although developed by adjacent communities, these two solutions are being tested in collaboration with the various actors of the open infrastructure community to ensure compatibility.

Open infrastructure makes good business sense. Leveraging openly developed, free, and open source software, organizations can:

  • Avoid vendor lock-in;
  • "See under the hood" and fully understand how it all works and how to optimize the system;
  • Develop innovative solutions to meet their own unique circumstances; and
  • Participate directly in the improvement of the software so that it will serve them even better in the future.

The beauty of open infrastructure is its flexibility: solutions can be added according to each organization's particular needs. Finally, the use of open solution standards makes the components of open infrastructure interoperable. The same APIs can be used on different deployments, allowing peer organizations to federate resources among different cloud environments, even to public clouds that have made the same choice of open infrastructure or use the same APIs.
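
To make the interoperability point concrete, here is a small illustrative sketch, not part of the original article: identical openstacksdk code pointed at two hypothetical clouds.yaml entries, one private and one public. Because both deployments expose the same OpenStack APIs, the same client code runs unchanged on each, which is what makes federating or bursting workloads across clouds practical.

```python
# Minimal sketch: the same OpenStack API used unchanged against several clouds.
# The cloud names are assumed entries in a local clouds.yaml; nothing here is
# specific to any one provider.
import openstack

CLOUDS = ["on-prem-private", "partner-public"]  # hypothetical clouds.yaml entries

for cloud_name in CLOUDS:
    conn = openstack.connect(cloud=cloud_name)
    # Identical compute calls work on every deployment that exposes the same API.
    servers = list(conn.compute.servers())
    print(f"{cloud_name}: {len(servers)} servers running")
```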

The era of open infrastructure is just beginning. Already, artificial intelligence (AI) is booming, with AI applications infiltrating more and more industries, leading to a rapidly increasing demand for compute-intensive infrastructure resources. Tomorrow, mobile applications (such as augmented reality applications) will require even more processing capacity and far less network latency, forcing organizations to push processing capacity as close as possible to the user who consumes it, i.e., to "the edge." Thus we are witnessing the advent of edge computing, a bursting out of the traditional data center model to a more decentralized, distributed and efficient model. Only open infrastructure will meet the challenges of standardization and interoperability that this revolution will bring.

 

