Every day we recommend one quality open source project on GitHub and one selected English-language tech or programming article. Follow 开源日报 (Open Source Daily). QQ group: 202790710; Telegram group: https://t.me/OpeningSourceOrg


Today's recommended open source project: "Computer Science" (a DIY, community-built CS curriculum). GitHub: https://github.com/ossu/computer-science

Why we recommend it: Want to study computer science? This open source project is a collaboratively maintained computer science curriculum, particularly suited to CS majors. It is not limited to career or professional development: it is also meant for anyone who wants a proper, well-rounded grounding in the discipline, and it offers those with the drive and interest to complete this education on their own a worldwide community to learn and discuss with.


The courses come mainly from Harvard, Princeton, MIT, and other universities, and have been carefully selected to meet the following criteria:

  1. Open to anyone for enrollment;
  2. Run on a regular basis;
  3. Meet OSSU's academic standards;
  4. Progress smoothly from introductory to advanced material;
  5. High quality and well suited for teaching;

Notes:

  1. If you put in 18-22 hours of study per week, you can finish the whole curriculum in roughly two years
  2. Apart from a small number of courses, most of them are free
  3. Don't violate the agreement you sign before a course begins
  4. FAQ: https://github.com/ossu/computer-science/blob/master/FAQ.md
  5. For questions not covered in the FAQ, ask on the forum: https://www.reddit.com/r/opensourcesociety/

Prerequisites:

  1. A high-school foundation in math and physics
  2. Know which introductory course to start with

Learning process:

  1. Work through the courses
  2. Teamwork is welcome
  3. There is a final project (in any language you like), so that you can apply what you have learned to solving a real-world problem
  4. After you finish, your work is evaluated by teachers, your peers, and experienced practitioners
  5. Once your studies are over, you can choose to go to work, read professional books to further enrich yourself, join a local enthusiasts' group, and keep an eye on how software is developing around the world.

Today's recommended English article: "How integrated, open infrastructure meets tomorrow's computing demands"

Author:     Original article: https://opensource.com/article/18/5/open-infrastructure

Why we recommend it: How can integrated, open infrastructure meet tomorrow's computing demands? The strength of open infrastructure lies in its flexibility: solutions can be added according to each organization's particular needs, which makes it very appealing.

How integrated, open infrastructure meets tomorrow's computing demands

Cloud infrastructure is quickly becoming an integral business component for nearly every company.

By virtualizing physical compute, networking, and storage resources, the cloud model of computing makes it possible to transform data center resources—previously limited, expensive and difficult to provision for users in a timely way—into flexible, elastic and easily consumable resources. Application developers love cloud because the infrastructure resources are available on demand and are programmatically accessible via APIs, which helps them innovate more quickly; they no longer have to worry about building or requisitioning the infrastructure to test and deploy their applications—they just consume it! Operators love cloud because it enables them to deliver more powerful services faster, more cost-efficiently, more securely and with high degrees of automation. CIOs love cloud because it leads to better utilization of resources and a reduction in total cost of ownership for infrastructure. CEOs love cloud because it makes their organizations more agile—able to innovate faster and respond more quickly to set themselves apart in a competitive marketplace.

Needless to say, the cloud is popular for good reason, and enterprising companies are taking the lead in meeting the market demand with public cloud services. Three companies, in particular, have set themselves apart: Amazon Web Services, Microsoft Azure, and Google Cloud Platform together account for half of the global public cloud market. Increasingly, however, voices are being raised against the inevitability of a future in which all of the world's infrastructure needs will be met by "The Big 3." In fact, many large companies opt to deploy their own infrastructure rather than pay the high margins of those Internet giants (who use those margins to finance their aggressive expansion into new areas). Instead of essentially funding their own long-term extinction by paying for public cloud services, companies such as Walmart, Best Buy, eBay, PayPal, AT&T, and Comcast have chosen a different path for their infrastructure needs.

And they are not alone. Others have made the choice to resist the hegemony of The Big 3 by offering their own public cloud alternative in the form of local and interoperable clouds. This is the case with OVH in France, Deutsche Telekom in Germany, City Network in Sweden, and VEXXHOST in Canada.

Likewise, in the area of scientific research, we find organizations where high-performance computing power is essential and every dollar must be used to the best advantage, and these institutions are opting to build their own infrastructure. CERN deployed an impressive private cloud (300,000 CPU cores, 800 TB of RAM) to cost-effectively analyze the results of the Large Hadron Collider and other experiments. It's now collaborating with other research institutions to build the compute and storage infrastructure for the Square Kilometer Array, an Earth-sized telescope expected to produce petabytes of data every day.

In China, where everything is oversized and there is an inherent mistrust of American suppliers, the choice to build your own infrastructure for both enterprise and government use is a natural one. We already find such infrastructure supporting trains for China Railway, the payment system for China UnionPay, and telephone services for millions of China Mobile customers.

To build their own infrastructure, these organizations are making a wise and strategic investment in the alternative to The Big 3: open infrastructure—combining the best programmable, flexible and interoperable open source software to build their own infrastructure. OpenStack provides the basis for this infrastructure, on which it is possible to deploy further cloud-native open source solutions like Kubernetes. Although developed by adjacent communities, these two solutions are being tested in collaboration with the various actors of the open infrastructure community to ensure compatibility.
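
The "programmable" part of that claim can be made concrete. Below is a minimal sketch, using the openstacksdk Python library, of provisioning a server on an OpenStack cloud; the cloud entry name and the image, flavor, and network names are assumptions standing in for whatever your own clouds.yaml and deployment define, not anything prescribed by the article.

```python
# Minimal sketch: programmatic provisioning on OpenStack via openstacksdk
# (pip install openstacksdk). The cloud entry "my-openstack" and the
# image/flavor/network names below are hypothetical placeholders.
import openstack

conn = openstack.connect(cloud="my-openstack")  # named entry in clouds.yaml

# Look up existing resources by name.
image = conn.compute.find_image("ubuntu-22.04")
flavor = conn.compute.find_flavor("m1.small")
network = conn.network.find_network("private")

# Boot a server and wait until it reaches ACTIVE.
server = conn.compute.create_server(
    name="demo-node",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.name, server.status)
```

Servers provisioned this way (or through OpenStack's higher-level tooling) can then host a Kubernetes cluster, which is the layering the paragraph above describes.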

Open infrastructure makes good business sense. Leveraging openly developed, free, and open source software, organizations can:

  • Avoid vendor lock-in;
  • "See under the hood" and fully understand how it all works and how to optimize the system;
  • Develop innovative solutions to meet their own unique circumstances; and
  • Participate directly in the improvement of the software so that it will serve them even better in the future.

The beauty of open infrastructure is its flexibility: solutions can be added according to each organization's particular needs. Finally, the use of open solution standards makes the components of open infrastructure interoperable. The same APIs can be used on different deployments, allowing peer organizations to federate resources among different cloud environments, even to public clouds that have made the same choice of open infrastructure or use the same APIs.
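
As a small illustration of that interoperability claim, the sketch below points the exact same client code at several deployments just by switching the named cloud entry; the cloud names are hypothetical entries in a local clouds.yaml, and any deployment exposing the same OpenStack APIs would answer the same calls.

```python
# Sketch: identical client code against multiple OpenStack deployments.
# "on-prem", "partner-cloud" and "public-provider" are hypothetical
# clouds.yaml entries; only the credentials/endpoints differ per cloud.
import openstack

def list_servers(cloud_name):
    """Connect to a named cloud and return (name, status) for its servers."""
    conn = openstack.connect(cloud=cloud_name)
    return [(s.name, s.status) for s in conn.compute.servers()]

for cloud in ("on-prem", "partner-cloud", "public-provider"):
    print(cloud, list_servers(cloud))
```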

The era of open infrastructure is just beginning. Already, artificial intelligence (AI) is booming, with AI applications infiltrating more and more industries, leading to a rapidly increasing demand for compute-intensive infrastructure resources. Tomorrow, mobile applications (such as augmented reality applications) will require even more processing capacity and far less network latency, forcing organizations to push processing capacity as close as possible to the user who consumes it, i.e., to “the edge.” Thus we are witnessing the advent of edge computing, a bursting out of the traditional data center model to a more decentralized, distributed and efficient model. Only open infrastructure will meet the challenges of standardization and interoperability that this revolution will bring.

 

