Docker Timing Out Issue 182 Docker/for-mac Github


The system was working properly with Docker Toolbox, but now, after the upgrade and removal of Toolbox, I'm getting timeouts or very slow builds. I have the same problem. I work with Docker under OS X El Capitan.

7: Pulling from library/java
357ea8c3d80b: Pull complete
52befadefd24: Pull complete
3c0732d5313c: Pull complete
557cb7f84eb9: Pull complete
7bbd9fac5727: Pull complete
15f5ec8580f1: Downloading 97.84 MB/139.6 MB

It downloads some parts pretty quickly, then it gets stuck. I'm using Docker version 1.11.2, build b9f10c9, and docker-compose version 1.7.1. I also modified my /etc/hosts, without success.

I'm having a similar issue; my docker-compose commands eventually complete, but they take an astoundingly long time. Commands that used to be instantaneous, such as logs or up, take a minute or more. I'm using Docker for Mac, and the issue began after an upgrade to 1.12.1-beta25 (build: 11807). The weird thing is that this slowness only impacts Compose; if I use docker from the OS X CLI, it's just as zippy as ever. So there's some sort of bad interaction between Compose and Docker for Mac (or its numerous proxy processes and other guest-to-host glue). Compose version info:

Never mind; the original Linux/Mac test was invalid. I didn't notice that the Linux machine ended up pulling some images. With a trivial project file, an apples-to-apples comparison gives us:

# OS X talking to Docker via docker.sock
time docker-compose --verbose up -d test
56.63 real  0.27 user  0.06 sys

# OS X talking to a remote daemon via tcp
time docker-compose --verbose up -d test
6.15 real  0.24 user  0.05 sys

The --verbose output and its timestamps confirm the pattern I saw at first: about 5 seconds elapse between blocks of activity relating to Docker requests. New output attached, with 100% fewer secrets.
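For anyone who wants to reproduce the socket-versus-TCP comparison without Compose in the picture, here is a minimal sketch using the Docker SDK for Python (pip install docker); the remote TCP address below is a placeholder, not a real daemon:

import time
import docker

def time_ping(base_url, rounds=5):
    # Average a few cheap round-trips to the daemon.
    client = docker.DockerClient(base_url=base_url)
    start = time.perf_counter()
    for _ in range(rounds):
        client.ping()
    return (time.perf_counter() - start) / rounds

print("unix socket:", time_ping("unix://var/run/docker.sock"))
print("remote tcp :", time_ping("tcp://192.0.2.10:2375"))  # placeholder host

If the per-call latency over the socket is dramatically higher than over TCP, the problem is in the local socket path rather than in Compose itself.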

One more salient detail: I just switched locations, and also switched network interfaces from Ethernet (with WiFi disabled) to my Mac's built-in WiFi interface. The performance problem has disappeared. This is very good evidence that the root cause is some weird interaction between Docker for Mac's VM, its networking glue, and my host - and that the cause is either related to routing or interface snooping. In any case, it's obvious that my particular slowness issue isn't closely related to Docker Compose.

If I discover anything noteworthy I'll file a ticket with the DfM team.

Post Syndicated from original While millions of people have died throughout history fighting for the right to vote, there is a significant wave of apathy among large swathes of the population in democracies where the ballot box is taken for granted. With this in mind, an educational project in the Czech Republic aims to familiarize high school students with basic democratic principles, acquaint them with the local electoral system, while promoting dialog among students, teachers, and parents. The main goal is to increase participation of young people in elections. In line with this project, young students across the country are invited to take part in a simulated general election, to get a taste of what things will be like when they reach voting age. This year, these Student Elections took place over two days starting October 3 in secondary schools across the Czech Republic. Under the One World Education Program at, a nonprofit that implements educational and human rights programs in crisis zones, 40,068 students from 281 schools cast their votes for political parties, movements and coalition candidates standing for the in the upcoming real elections.

Students aged 15 and above were eligible to vote, and when the votes were all counted, the results were quite something for any follower of the worldwide Pirate Party movement. Of all groups, the Czech Pirate Party won a decisive victory, netting 24.5% of the overall vote, double that achieved by the ANO movement (11.9%) and the right-wing TOP 09 (11.8%). The fourth and fifth-placed candidates topped out at 7.76% and 6.33% respectively. “The results of the Student Elections will be compared to the results of the election in a couple of weeks. It is certain they will vary greatly,” says Karel Strachota, director of the One World at School Education Program and the person who launched the Student Election project seven years ago. “At the same time, however, the choice of students seems to indicate a certain trend in the development of voter preferences.

From our teachers and school visits, we know that, as in the past, most of the pupils have been able to choose responsibly.” According to opinion polls for the upcoming election (October 20-21), the ANO movement is the clear favorite, with the Pirates having “a big chance to succeed.” Given the results of the simulation, elections in coming years could be something really special for the Pirates. The full results of the Student Elections 2017 can be found on the One World website. Meanwhile, Czech Pirate Party President Ivan Bartos sings to voters in the pre-election video below, explaining why Pirates are needed in Parliament in 2017.

Early Bird Tickets Now Available

We’ve released a limited number of Early Bird tickets before General Admission tickets are available.

Take advantage of this discount before they’re sold out! Interested in speaking at GrafanaCon? We’re looking for technical and non-technical talks of all sizes.

From the Blogosphere:

Microsoft recently announced the ability to access a subset of Azure Cosmos DB metrics via the Azure Monitor API. Grafana Labs built a plugin for Grafana 4.5 to visualize the data.

Brian was tired of guessing about the performance of his development machines and test environment. Here, he shows how to monitor Docker with Prometheus to get a better understanding of a dev environment in his quest to monitor all the things.

This article covers enokido’s process of choosing a monitoring platform. He identifies three possible solutions, outlines the pros and cons of each, and discusses why he chose Prometheus.

It’s fascinating to see Grafana dashboards with production data from companies around the world. For instance, we’ve previously highlighted the huge number of dashboards publicly shared. This week, we found that GitLab also has public dashboards to explore.

It’s important to know the state of your applications in a scalable environment such as Docker Swarm. This video covers an overview of Docker, VMs vs. containers, orchestration, and how to monitor Docker Swarm.

Learn how to use counters from multiple disparate sources, devices, operating systems, and applications to generate actionable time series data.

This video demo shows off some of the upcoming features of OFPSniffer, an OpenFlow sniffer that helps with network troubleshooting in production networks.

Grafana Plugins

Plugin authors add new features and bugfixes all the time, so it’s important to always keep your plugins up to date. To update plugins from on-prem Grafana, use the grafana-cli tool; if you are using Hosted Grafana, you can update with one click! If you have questions or need help, hit up our community site, where the Grafana team and members of the community are happy to help.

UPDATED PLUGINS

BT Plugins – Our friends at BT have been busy. All of the BT plugins in our catalog received an update this week. Changes include:

- Custom dashboard links now work in Internet Explorer.
- The Peak Report panel no longer supports click-to-sort.
- The Status Dot panel tooltips now look like Grafana tooltips.

This week’s MVC (Most Valuable Contributor)

Each week we highlight some of the important contributions from our amazing open source community. This week, we’d like to recognize a contributor who did a lot of work to improve Prometheus support.

Post Syndicated from original Many customers use Amazon Kinesis to ingest, analyze, and persist their streaming data. One of the easiest ways to gain real-time insights into your streaming data is to use Kinesis Analytics. It enables you to query the data in your stream or build entire streaming applications using SQL. Customers use Kinesis Analytics for things like filtering, aggregation, and anomaly detection. Kinesis Analytics now gives you the option to preprocess your data with AWS Lambda.

This gives you a great deal of flexibility in defining what data gets analyzed by your Kinesis Analytics application. You can also define how that data is structured before it is queried by your SQL. In this post, I discuss some common use cases for preprocessing, and walk you through an example to help highlight its applicability.

Common use cases

There are many reasons why you might choose to preprocess data before starting your analysis. Because you build your preprocessing logic with Lambda, your preprocessor can do anything supported by Lambda. However, there are some specific use cases that lend themselves well to preprocessing, such as data enrichment and data transformation.


Enrichment

In some scenarios, you may need to enhance your streaming data with additional information before you perform your SQL analysis. Kinesis Analytics gives you the ability to use data from Amazon S3 in your Kinesis Analytics application, using the Reference Data feature. However, you cannot use other data sources from within your SQL query. To add dynamic data to your streaming data, you can preprocess with a Lambda function, and retrieve the data from the data store of your choosing. For example, consider a scenario where you’re streaming some data about users of your application, including their IP address.

You want to do some real-time analysis on the geographic locations of your customers. In this example, your preprocessing Lambda function uses your data source for geolocation information to retrieve the user’s city, state or province, and country, based on the IP address that was included in the streaming record.
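As a rough illustration (not the post’s actual code), the per-record enrichment step might look like the sketch below, with a stand-in lookup table in place of a real geolocation data store:

import json

# Stand-in lookup table; in practice this might be a DynamoDB table, an
# ElastiCache cluster, or a geolocation API keyed by IP address.
GEO_TABLE = {'203.0.113.7': {'city': 'Seattle', 'region': 'WA', 'country': 'US'}}

def enrich(record_json):
    record = json.loads(record_json)
    geo = GEO_TABLE.get(record['ip_address'], {})
    record.update(geo)          # add city/region/country when we know them
    return json.dumps(record)

print(enrich('{"user_id": 42, "ip_address": "203.0.113.7"}'))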

You then enrich the record with that information, and your SQL query can use those attributes in its aggregation.

Transformation

Because Kinesis Analytics uses SQL to analyze your data, the structure of your streaming records must be mapped to a schema. If your records are JSON or CSV, Kinesis Analytics automatically creates a schema.

However, if your JSON records contain complex nested arrays, you may need to customize how the record structure is mapped to a flattened schema. Further, Kinesis Analytics is unable to automatically parse formats such as GZIP, protobuf, or Avro. If your input records are unstructured text, Kinesis Analytics creates a schema, but it consists of a single column representing your entire record. To remedy these complexities, use Lambda to transform and convert your streaming data so that it more easily maps to a schema that can be queried by the SQL in your Kinesis Analytics application.

Assume that you’re streaming raw Apache access log data from a web fleet to a Kinesis stream, and you want to use Kinesis Analytics to detect anomalies in your HTTP response codes. In this example, you want to detect when your stream contains an unusually large number of 500 response codes. This may indicate that something has gone wrong somewhere in your application, and as a result, Apache is returning 500 responses to clients. This is typically not a good customer experience. An example Apache access log record looks like this:

231.55.150.184 - - [28/Sep/2017:11:18:59 -0400] "PUT /explore HTTP/1.1" 200 2742 "-" "Mozilla/5.0 (Windows; U; Windows NT 6.3) AppleWebKit/538.0.1 (KHTML, like Gecko) Chrome/20.0.872.0 Safari/538.0.1"

Although its structure is well-defined, it is not JSON or CSV, so it doesn’t map to a defined schema in Kinesis Analytics. To use Kinesis Analytics with raw Apache log records, you can transform them to JSON or CSV with a preprocessing Lambda function.
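A minimal sketch of such a preprocessor is shown below. It assumes the record contract Kinesis Analytics uses for Lambda preprocessing (base64-encoded data in; recordId, result, and re-encoded data out), and the regex only covers the fields shown above rather than being a complete Apache log parser:

import base64
import json
import re

LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) (?P<bytes>\S+)'
)

def handler(event, context):
    output = []
    for record in event['records']:
        raw = base64.b64decode(record['data']).decode('utf-8')
        match = LOG_PATTERN.search(raw)
        if match:
            doc = json.dumps(match.groupdict())
            output.append({
                'recordId': record['recordId'],
                'result': 'Ok',
                'data': base64.b64encode(doc.encode('utf-8')).decode('utf-8'),
            })
        else:
            # Records that don't parse are dropped rather than failing the batch.
            output.append({
                'recordId': record['recordId'],
                'result': 'Dropped',
                'data': record['data'],
            })
    return {'records': output}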

The result is a simple JSON document that maps easily to a schema.

Post Syndicated from original Backblaze just ordered 100 petabytes’ worth of hard drives, and yes, we’ll use nearly all of them in Q4.

In fact, we’ll begin the process of sourcing the Q1 hard drive order in the next few weeks. What are we doing with all those hard drives? Let’s take a look.

Our First 10 Petabyte Backblaze Vault

Ken clicked the submit button and 10 petabytes of Backblaze Cloud Storage came online, ready to accept customer data.

Ken (aka the Pod Whisperer) is one of our Datacenter Operations Managers at Backblaze, and with that one click, he activated Backblaze Vault 1093, which was built with 1,200 Seagate 10 TB drives (model: ST10000NM0086). After formatting and configuration of the disks, there is 10.12 petabytes of free space remaining for customer data. Back in 2011, when Ken started at Backblaze, he was amazed at how much data we had amassed. The Seagate 10 TB drives we deployed in vault 1093 are helium-filled drives. We had previously deployed 45 HGST 8 TB helium drives, where we learned one of the benefits of using helium drives — they consume less power than traditional air-filled drives. Here’s a quick comparison of the power consumption of several high-density drive models we deploy.

“100 Petabytes should get us through Q4.” — Tim Nufire, Chief Cloud Officer, Backblaze

The 1,200 Seagate 10 TB drives are just the beginning. The next Backblaze Vault will be configured with 12 TB drives, which will give us 12.2 petabytes of storage in one vault. We are currently building and adding two to three Backblaze Vaults a month to our cloud storage system, so we are going to need more drives. When we did all of our “drive math,” we decided to place an order for 100 petabytes of hard drives comprised of 10 and 12 TB models. Gleb, our CEO and co-founder, exhaled mightily as he signed the biggest purchase order in company history. Wait until he sees the one for Q1.

400 Petabytes of Cloud Storage

When we added Backblaze Vault 1093, we crossed over 400 petabytes of total available storage. For those of you keeping score at home, we reached 350 petabytes about 3 months ago, as you can see in the chart below.

Backblaze Vault Primer

All of the storage capacity we’ve added in the last two years has been on our Backblaze Vault architecture, with vault 1093 being the 60th one we have placed into service. Each Backblaze Vault is comprised of 20 Backblaze Storage Pods logically grouped together into one storage system.

Today, each Storage Pod contains sixty 3 ½” hard drives, giving each vault 1,200 drives. Early vaults were built on Storage Pods with 45 hard drives, for a total of 900 drives in a vault. A Backblaze Vault accepts data directly from an authenticated user.
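As a quick sanity check on those numbers, here is the raw-capacity arithmetic as a small sketch (the 10.12 PB usable figure quoted earlier reflects formatting and configuration overhead on top of this):

pods_per_vault = 20
drives_per_pod = 60
drive_tb = 10

drives = pods_per_vault * drives_per_pod      # 1,200 drives per vault
raw_pb = drives * drive_tb / 1000             # 12.0 PB of raw capacity
print(f"{drives} drives, {raw_pb:.1f} PB raw")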

Each data blob (object, file, group of files) is divided into 20 shards (17 data shards and 3 parity shards) using our erasure coding scheme. Each of the 20 shards is stored on a different Storage Pod in the vault. At any given time, several vaults stand ready to receive data storage requests.

Drive Stats for the New Drives

In our Q3 2017 Drive Stats report, due out in late October, we’ll start reporting on the 10 TB drives we are adding. It looks like the 12 TB drives will come online in Q4.

We’ll also get a better look at the 8 TB consumer and enterprise drives we’ve been following.

Other Big Data Clouds

We have always been transparent here at Backblaze, including about how much data we store and how we store it.

Very few others do the same. But, if you have information on how much data a company or organization stores in the cloud, let us know in the comments. Please include the source and make sure the data is not considered proprietary. If we get enough tidbits we’ll publish a “big cloud” list.

Post Syndicated from original Join us in San Francisco at the AWS Pop-up Loft for AWS IAM Day on Monday, October 9, from 9:30 A.M.–4:15 P.M. Pacific Time. At this free technical event, you will learn AWS Identity and Access Management (IAM) concepts from IAM product managers, as well as tools and strategies you can use for controlling access to your AWS environment, such as the IAM policy language and IAM best practices. You will also take an IAM policy ninja deep dive into permissions and how to use IAM roles to delegate access to your AWS resources.

Last, you will learn how to integrate Active Directory with AWS workloads. You can attend one session or stay for the full day. Learn more about the available sessions!

Post Syndicated from original October has come at last, and with it, the joy of Halloween is now upon us. So while I spend the next 30 days quoting Hocus Pocus at every opportunity, here’s Adafruit’s latest spooky build: the spooktacular haunted portrait.

Haunted Portraits

If you’ve visited a haunted house such as Disney’s Haunted Mansion, or walked the halls of Hogwarts at Universal Studios, you will have seen a ‘moving portrait’. Whether it’s the classic ‘did that painting just blink?’ approach, or occupants moving in and out of frame, they’re an effective piece of spooky decoration – and now you can make your own!

Adafruit’s AdaBox

John Park, maker extraordinaire, recently posted a live make video where he used the contents of the Raspberry Pi-themed AdaBox 005 to create a blinking portrait. The AdaBox is Adafruit’s own maker subscription service where plucky makers receive a mystery parcel containing exciting tech and inspirational builds.

Their more recent delivery contains a Raspberry Pi Zero, their own Joy Bonnet, a case, and peripherals, including Pimoroni’s no-solder Hammer Headers. While you can purchase the AdaBoxes as one-off buys, subscribers get extra goodies. With AdaBox 005, they received bonus content including Raspberry Pi swag in the form of stickers, and a copy of The MagPi Magazine. The contents of AdaBox 005 allow makers to build their own Raspberry Pi Zero tiny gaming machine. But the ever-working minds of the Adafruit team didn’t want to settle there, so they decided to create more tutorials based on the box’s contents, such as this haunted portrait.

Bringing a portrait to life

Alongside the AdaBox 005 content, all of which can be purchased from Adafruit directly, you’ll need a flat-screen monitor and a fancy frame.

The former could be an old TV or computer screen, while the latter, unless you happen to have an ornate frame that perfectly fits your monitor, can be made from cardboard, CNC-cut wood, or gold-painted macaroni and tape, probably. You’ll need to attach headers to your Raspberry Pi Zero. For those of you who fear the soldering iron, the Hammer Headers can be hammered into place without the need for melty hot metal.

If you’d like to give soldering a go, you can follow Laura’s. In his tutorial, John goes on to explain how to set up the Joy Bonnet (if you wish to use it as an added controller), set your Raspberry Pi to display in portrait mode, and manipulate an image in Photoshop or GIMP to create the blinking effect. Blinking eyes are just the start of the possibilities for this project. This is your moment to show off your image manipulation skills!

Why not have the entire head flash to show the skull within? Or have an ethereal image appear in the background of an otherwise unexceptional painting of a bowl of fruit? In the final stages of the tutorial, John explains how to set an image slideshow running on the Pi, and how to complete the look with the aforementioned ornate frame. He also goes into detail about the importance of using a matte effect screen or transparent gels to give a more realistic ‘painted’ feel. You’ll find everything you need to make your own haunted portrait, including a link to John’s entire live stream. We’re going to make this for Pi Towers.
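If you want to experiment with the slideshow side of the build before watching the stream, a minimal, hypothetical sketch using pygame might look like the one below (this is not John’s code, and the image folder path is a placeholder):

import glob
import time
import pygame

pygame.init()
screen = pygame.display.set_mode((0, 0), pygame.FULLSCREEN)
size = screen.get_size()

# Load every frame of the "portrait" and scale it to the screen; two frames
# (eyes open, eyes closed) are enough for a simple blink.
frames = [pygame.transform.scale(pygame.image.load(p), size)
          for p in sorted(glob.glob('/home/pi/portrait/*.png'))]

while True:
    for frame in frames:
        pygame.event.pump()           # keep the window responsive
        screen.blit(frame, (0, 0))
        pygame.display.flip()
        time.sleep(4)                 # linger; shorten for a quicker blink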

I’m wondering whether I could create an entire gallery of portraits specifically for our reception area and see how long it takes people to notice, though I possibly shouldn’t have given my idea away on this rather public blog post. If you make the Haunted Portrait, or any other Halloween-themed Pi build, make sure you share it with us in the comments below.

Post Syndicated from original After years of accepting donations via Bitcoin, last month various ‘pirate’ sites began to generate digital currency revenues in a brand new way. It all began with The Pirate Bay, which added a JavaScript cryptocurrency miner to its main site, something that first manifested itself as a large spike in CPU utilization on the machines of visitors. The stealth addition to the platform, which its operators later described as a test, was extremely controversial. While many thought of the miner as a way to generate revenue in a secure fashion, a vocal majority expressed a preference for permission being requested first, in case they didn’t want to participate in the program.

Over the past couple of weeks, several other sites have added similar miners, some which ask permission to run and others that do not. While the former probably aren’t considered problematic, the latter are now being viewed as a serious problem by an unexpected player in the ecosystem. TorrentFreak has learned that popular CDN service Cloudflare, which is often criticized for not being harsh enough on ‘pirate’ sites, is actively suspending the accounts of sites that deploy cryptocurrency miners on their platforms. “Cloudflare kicked us from their service for using a Coinhive miner,” the operator of ProxyBunker informed TF this morning.

ProxyBunker is a site that links to several other domains that offer unofficial proxy services for the likes of The Pirate Bay, RARBG, KickassTorrents, Torrentz2, and dozens of other sites. It first tested a miner for four days starting September 23. Official implementation began October 1 but was ended abruptly last evening. “Late last night, all our domains got deleted off Cloudflare without warning, so I emailed Cloudflare to ask what was going on,” the operator explained.

Bye bye

As the email above shows, Cloudflare cited only a “possible” terms of service violation. Further clarification was needed to get to the root of the problem. So, just a few minutes later, the site operator contacted Cloudflare, acknowledging the suspension but pointing out that the notification email was somewhat vague and didn’t give a reason for the violation.

A follow-up email from Cloudflare certainly put some meat on the bones. “Multiple domains in your account were injecting Coinhive mining code without notifying users and without any option to disabling [sic] the mining,” wrote Justin Paine, Head of Trust & Safety at Cloudflare. “We consider this to be malware, and as such the account was suspended, and all domains removed from Cloudflare.”

Cloudflare: Unannounced miners are malware

ProxyBunker’s operator wrote back to Cloudflare explaining that the Coinhive miner had been running on his domains but that his main domain had a way of disabling mining, as per new code made available from Coinhive.

“We were running the miner on our proxybunker.online domain using Coinhive’s new JavaScript code that lets the user stop the miner at any time and set the CPU speed it mines at,” he told TF. Nevertheless, some element of the configuration appears to have fallen short of Cloudflare’s standards. So, shortly after Cloudflare’s explanation, the site operator asked if he could be reinstated if he completely removed the miner from his site. The response was a ‘yes’ but with a stern caveat attached. “We will remove the account suspension, however do note you’ll need to re-sign up the domains as they were removed as a result of the account suspension. Please note — if we discover similar activity again the domains and account will be permanently blocked,” Cloudflare’s Justin warned. ProxyBunker’s operator says that while he sees the value in cryptocurrency miners, he can understand why people might be opposed to them too.

That being said, he would appreciate it if services like Cloudflare published clear guidelines on what is and is not acceptable. “We do understand that most users will not like the miner using up a bit of their CPU, but we do see the full potential as a new revenue stream,” he explains. “I think third-party services need to post clear information that they’re not allowed on their services, if that’s the case.” At time of publication, Cloudflare had not responded to TorrentFreak’s requests for comment.

Post Syndicated from original For passive projects such as point-of-sale displays, video loopers, and your upcoming Halloween builds, Adafruit have come up with a read-only solution for powering down your Raspberry Pi without endangering your SD card.

Pulling the plug

At home, at a coding club, or at a Jam, you rarely need to pull the plug on your Raspberry Pi without going through the correct shutdown procedure.

To ensure a long life for your SD card and its contents, you should always turn off your Pi by selecting the shutdown option from the menu. This way the Pi saves any temporary files to the card before relinquishing power. By pulling the plug while your OS is still running, you might corrupt these files, which could result in the Pi failing to boot up again.

The only fix? Wipe the SD card clean and start over, waving goodbye to all files you didn’t back up.

Passive projects

But what if it’s not as easy as selecting shutdown, because your Raspberry Pi is embedded deep inside the belly of a project? Maybe you’ve hot-glued your Pi into a pumpkin which is now screwed to the roof of your porch, or your store has a bank of Pi-powered monitors playing ads and the power is set to shut off every evening. Without the ability to shut down your Pi via the menu, you risk the SD card’s contents every time you power down your project.

Read-only

Just in time for the plethora of Halloween projects we’re looking forward to this month, the clever folk at Adafruit have designed a solution for this issue.

They’ve shared a script which forces the Raspberry Pi to run in read-only mode, so that powering it down via a plug pull will not corrupt the SD card. The script makes the Pi save temporary files to the RAM instead of the SD card. Of course, this means that no files or new software can be written to the card. However, if that’s not necessary for your Pi project, you might be happy to make the trade-off.

Note that you can only use Adafruit’s script on. Find more about the read-only Raspberry Pi solution, including the script and optional GPIO-halt utility,. And be aware that making your Pi read-only is irreversible, so be sure to back up the contents of your SD card before you implement the script. It’s October, and we’re now allowed to get excited about Halloween and all of the wonderful projects you plan on making for the big night. Post Syndicated from original If you’re attending this year’s in Orlando, AWS is to join us for a free evening of learning and networking. This AWS Security Jam will feature an opportunity to learn more about the AWS Security team (and about AWS security), socialize with peers, and engage in a night of trivia with your fellow conference friends. We will provide light appetizers and drinks.

Day: Wednesday, October 4, 2017. Time: 5:30–8:00 P.M. Eastern Time. Location: Rosen Centre Hotel Executive Ballroom, 9840 International Drive, Orlando, FL 32819 (next to the Orange County Convention Center) The first 150 attendees will win a door prize, and we will give additional prizes as part of a raffle at the end of the event. Follow us on Twitter for more information and updates about all things AWS Security and Compliance. Post Syndicated from original Uh, hey. Been a while.

My computer died? Linux abruptly put the primary hard drive in read-only mode, which seemed Really Bad, but then it refused to boot up entirely. I suspect the motherboard was on its last legs (though the drive itself was getting pretty worn out too), so long story short, I lost a week to ordering/building an entirely new machine and rearranging/zeroing hard drives.

The old one was six years old, so it was about time anyway. I also had some internet stuff to deal with, so overall I've had a rollercoaster of a week. Oh, and now my keyboard is finally starting to break.

fox flux: I'm at the point where the protagonists are almost all done and I've started touching up particular poses (times ten). So that's cool. If I hadn't lost the last week I might've been done with it by now!

devops: Well, there was that whole computer thing. Also I suddenly have support for colored fonts (read: emoji) in all GTK apps (except Chromium), and that led me to spend at least half a day trying to find a way to get Twemoji into a font using Google's font extensions. Alas, no dice, so I'm currently stuck with a fairly outdated copy of the Android emoji, which I don't want to upgrade because Google makes them worse with every revision.

blog: I started on a post. I didn't get very far. I still owe two for September. Oops.

book: Did some editing, worked on some illustrations. I figured out how to get math sections to (mostly) use the same font as body text, so inline math doesn't look quite so comically out of place any more.

cc: Fixed some stuff I broke, as usual, and worked some more on a Unity GUI for defining and editing sprite animations.

I'm now way behind and have completely lost all my trains of thought, though I guess having my computer break is a pretty good excuse. Trying to get back up to speed as quickly as possible.

Oh, and happy October. 🎃. Post Syndicated from original As consumers continue to demand faster, simpler, and more on-the-go services, FinTech companies are responding with to fit everyone’s needs and to improve customer experience.

This month, we are excited to feature the following startups—all of whom are disrupting traditional financial services in unique ways:

- Acorns – allowing customers to invest spare change automatically.
- Bondlinc – improving the bond trading experience for clients, financial institutions, and private banks.
- Lenda – reimagining homeownership with a secure and streamlined online service.

Acorns (Irvine, CA)

Driven by the belief that anyone can grow wealth, Acorns is relentlessly pursuing ways to help make that happen. Currently the fastest-growing micro-investing app in the U.S., Acorns takes mere minutes to get started and is currently helping over 2.2 million people grow their wealth. And unlike other FinTech apps, Acorns is focused on helping America’s middle class – namely the 182 million citizens who make less than $100,000 per year – and looking after their financial best interests. Acorns is able to help their customers effortlessly invest their money, little by little, by offering investment portfolios put together by Dr. Harry Markowitz, a Nobel Laureate in economic sciences. They also offer a range of services, including “Round-Ups,” whereby customers can automatically invest spare change from everyday purchases, and “Recurring Investments,” through which customers can set up automatic transfers of just $5 per week into their portfolio. Additionally, Acorns’ earning platform can help anyone spend smarter, as the company connects customers to partner brands, which then automatically invest in customers’ Acorns accounts. The Acorns platform runs entirely on AWS, allowing them to deliver a secure and scalable cloud-based experience.

By utilizing AWS, Acorns is able to offer an exceptional customer experience and fulfill its core mission. Acorns uses AWS services for its core infrastructure, data storage, and document retention. Acorns is hiring! Be sure to check out their careers page if you are interested.

Bondlinc (Singapore)

The founder and CEO of Bondlinc has long wanted to standardize, improve, and automate the traditional workflows that revolve around bond trading.

As a former trader at BNP Paribas and Jefferies & Company, E.K. – as Keong is known – had personally seen how manual processes led to information bottlenecks in over-the-counter practices. This drove him, along with future Bondlinc CTO, to start a new service that maximizes efficiency, information distribution, and accessibility for both clients and bankers in the bond market. Currently, bond trading requires banks to spend a significant amount of resources retrieving data from expensive and restricted institutional sources, performing suitability checks, and attaching required documentation before presenting all relevant information to clients – usually by email.

Bankers are often overwhelmed by these time-consuming tasks, which means clients don’t always get proper access to time-sensitive bond information and pricing. Bondlinc bridges this gap between banks and clients by providing a variety of solutions, including easy access to basic bond information and analytics, updates of new issues and relevant news, consolidated management of your portfolio, and a chat function between banker and client. By making the bond market much more accessible to clients, Bondlinc is taking private banking to the next level, while improving efficiency of the banks as well. As a startup running on AWS since inception, Bondlinc has built and operated its SaaS product by leveraging, and across multiple Availability Zones to provide its customers (namely, financial institutions) a highly available and seamlessly scalable product distribution platform.

Bondlinc also makes extensive use of, and to meet the stringent operational monitoring, auditing, compliance, and governance requirements of its customers. Bondlinc is currently experimenting with to build a conversational interface into its mobile application via a chat-bot that provides trading assistance services. To see how Bondlinc works, request a demo at. Lenda (San Francisco, CA) is a digital mortgage company founded by seasoned FinTech entrepreneur. Jason wanted to create a smarter, simpler, and more streamlined system for people to either or their homes.

With Lenda, customers can find out if they are pre-approved for loans, and receive accurate, real-time mortgage rate quotes from industry-experienced home loan advisors. Lenda’s advisors support customers through the loan process by providing financial advice and guidance for a seamless experience. Lenda’s innovative platform allows borrowers to complete their home loans online from start to finish.

Through a savvy combination of being a direct lender with proprietary technology, Lenda has simplified the mortgage application process to save customers time and money. With an interactive dashboard, customers know exactly where they are in the mortgage process and can manage all of their documents in one place. The company recently received its Series A funding of $5.25 million, and van den Brand shared that most of the capital investment will be used to improve Lenda’s technology and fulfill the company’s mission, which is to reimagine homeownership, starting with home loans. AWS allows Lenda to scale its business while providing a secure, easy-to-use system for a faster home loan approval process. Currently, Lenda uses,.

Visit to find out more. — Thanks for reading and see you in October for another round of hot startups!

Post Syndicated from original It’s no secret that we love music projects at Pi Towers. On the contrary, we often shout it from the rooftops like we’re! But the PianoAI project by Zack left us slack-jawed: he built an AI on a Raspberry Pi that listens to his piano playing, and then produces improvised, real-time accompaniment. Another example of a short teaching and then jamming with piano with a version I’m more happy with.

I have to play for the Pi for a little while before the Pi has enough data to make its own music. The PianoAI Inspired by a, Zack set out to create an AI able to imitate his piano-playing style in real time. He began programming the AI in Python, before starting over in the open-source. Some of Zack’s notes for his AI If you just want to try out PianoAI, head over to. He provides a detailed guide that talks you through how to implement and use it.

Music to our ears

The Raspberry Pi community never fails to amaze us with their wonderful builds, not least when it comes to musical ones. Check out this cool-looking build, this one by David Sharples, and this one by Dmitry Morozov. Aren’t they all splendid? And the list goes on. Which instrument do you play? The recorder? The jaw harp? Could you create an AI like Zack’s for it?

Let us know in the comments below, and share your builds with us.

As we were getting ready to publish this post, we received news from UpdraftPlus, one of the biggest WordPress plugin developers, that they are supporting Backblaze B2 as a storage solution for their backup plugin. They shipped the update (1.13.9) this week. This is great news for Backblaze customers!

UpdraftPlus is also offering a 20% discount to Backblaze customers wishing to purchase or upgrade to UpdraftPlus Premium. The complete information is below. UpdraftPlus joins backup plugin developer XCloner in supporting Backblaze B2. A third developer, BlogVault, also announced their intent to support Backblaze B2. If your backup plugin doesn’t support B2 yet, urge its developer to add it as well.

Now, back to our post. Your WordPress website data is on a web server that’s most likely located in a large data center.

You might wonder why it is necessary to have a backup of your website if it’s in a data center. Website data can be lost in a number of ways, including mistakes by the website owner (been there), hacking, or even domain ownership dispute (I’ve seen it happen more than once). A website backup also can provide a history of changes you’ve made to the website, which can be useful. As an overall strategy, it’s best to have a backup of any data that you can’t afford to lose for personal or business reasons. Your web hosting company might provide backup services as part of your hosting plan. If you are using their service, you should know where and how often your data is being backed up. You don’t want to find out too late that your backup plan was not adequate.

Sites on WordPress.com are backed up by VaultPress (Automattic), which also is available for self-hosted WordPress installations. If you don’t want the work or decisions involved in managing the hosting for your WordPress site, WordPress.com will handle it for you. You do, however, give up some customization abilities, such as the option to add plugins of your own choice. Very large and active websites might consider premium WordPress hosting from Automattic or another managed WordPress hosting service.

This post is about backing up self-hosted WordPress sites, so we’ll focus on those options.

WordPress Backup

Backup strategies for WordPress can be divided into broad categories depending on 1) what you back up, 2) when you back up, and 3) where the data is backed up. With server data, such as with a WordPress installation, you should plan to have three copies of the data (the 3-2-1 backup strategy). The first is the active data on the WordPress web server, the second is a backup stored on the web server or downloaded to your local computer, and the third should be in another location, such as the cloud.

We’ll talk about the different approaches to backing up WordPress, but we recommend using a WordPress plugin to handle your backups. A backup plugin can automate the task, optimize your backup storage space, and alert you of problems with your backups or WordPress itself. We’ll cover plugins in more detail below.

What to Back Up?

The main components of your WordPress installation are:

- The WordPress database
- WordPress plugins
- WordPress themes
- User-created media and files
- PHP, JavaScript, and other code files
- Other support files

You should decide which of these you need to back up. The database is the top priority, as it contains all your website posts and pages (exclusive of media). Your current theme is important, as it likely contains customizations you’ve made. Following those in priority are any other files you’ve customized or made changes to.

You can choose to back up the WordPress core installation and plugins, if you wish, but these files can be downloaded again if necessary from the source, so you might not wish to include them. You likely have all the media files you use on your website on your local computer (which should be backed up), so it is your choice whether to back these up from the server as well. If you wish to be able to recreate your entire website easily in case of data loss or disaster, you might choose to back up everything, though on a large website this could be a lot of data. Generally, you should 1) prioritize any file that you’ve customized that you can’t afford to lose, and 2) decide whether you need a copy of everything in order to get your site back up quickly. These choices will determine your backup method and the amount of storage you need.
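If you are curious what those pieces look like outside of a plugin, here is a minimal, hypothetical sketch of a manual backup written in Python (the plugins discussed below automate all of this; the paths and credentials are placeholders):

import subprocess
import tarfile
from datetime import date

stamp = date.today().isoformat()

# 1. Archive the files you can't simply re-download: themes, uploads, customizations.
with tarfile.open(f"/backups/wp-content-{stamp}.tar.gz", "w:gz") as tar:
    tar.add("/var/www/html/wp-content", arcname="wp-content")

# 2. Dump the database, the top-priority item (posts, pages, settings).
with open(f"/backups/wordpress-db-{stamp}.sql", "wb") as out:
    subprocess.run(
        ["mysqldump", "--user=wp_user", "--password=change-me", "wordpress"],
        stdout=out, check=True,
    )

# 3. Copy both archives somewhere off the server (local machine, cloud storage, etc.)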

A good backup plugin for WordPress enables you to specify which files you wish to back up, and even to create separate backups and schedules for different backup contents. That’s another good reason to use a plugin for backing up WordPress.

When to Back Up?

You can back up manually at any time by using the Export tool in WordPress. This is handy if you wish to do a quick backup of your site or parts of it.

Since it is manual, however, it is not a part of a dependable backup plan that should be done regularly. If you wish to use this tool, go to Tools, Export, and select what you wish to back up.

The output will be an XML file that uses the WordPress Extended RSS format, also known as WXR. You can create a WXR file that contains all of the information on your site or just portions of the site, such as posts or pages by selecting: All content, Posts, Pages, or Media. Note: You can use WordPress’s Export tool for sites hosted on WordPress.com, as well. Many of the backup plugins we’ll be discussing later also let you do a manual backup on demand in addition to regularly scheduled or continuous backups. Note: Another use of the WordPress Export tool and the WXR file is to transfer or clone your website to another server. Once you have exported the WXR file from the website you wish to transfer from, you can import the WXR file from the Tools, Import menu on the new WordPress destination site.

Be aware that there are file size limits depending on the settings on your web server. See the for more information. To make this job easier, you may wish to use one of a number of WordPress plugins designed specifically for this task. You also can manually back up the WordPress MySQL database using a number of tools or a plugin.

The WordPress documentation has good information on this. The backup plugins discussed below will handle the database backup for you and do it automatically.

They also typically include tools for optimizing the database tables, which is just good housekeeping. A dependable backup strategy doesn’t rely on manual backups, which means you should consider using one of the many backup plugins available either free or for purchase. We’ll talk more about them below.

Which Format To Back Up In?

In addition to the WordPress WXR format, plugins and server tools will use various file formats and compression algorithms to store and compress your backup. You may get to choose between zip, tar, tar.gz, tar.bz2, and others.

Select a format that you know you can access and unarchive should you need access to your backup. All of these formats are standard and supported across operating systems, though you might need to download a utility to access the file.

Where To Back Up?

Once you have your data in a suitable format for backup, where do you back it up to? We want to have multiple copies of our active website data, so we’ll choose more than one destination for our backup data. The backup plugins we’ll discuss below enable you to specify one or more possible destinations for your backup. The possible destinations for your backup include:

A backup folder on your web server – an OK solution if you also have a copy elsewhere. Depending on your hosting plan, the size of your site, and what you include in the backup, you may or may not have sufficient disk space on the web server. Some backup plugins allow you to configure the plugin to keep only a certain number of recent backups and delete older ones, saving you disk space on the server.

Email to you – because email servers have size limitations, the email option is not the best one to use unless you use it to specifically back up just the database or your main theme files.

FTP, SFTP, SCP, WebDAV – all widely supported protocols for transferring files over the internet, which can be used if you have access credentials to another server or supported storage device that is suitable for storing a backup.

Sync service (Dropbox, SugarSync, Google Drive, OneDrive) – another possible storage location, though it can be a pricier choice depending on the plan you have and how much you wish to store.

Cloud storage (Backblaze B2, Amazon S3, Google Cloud, Microsoft Azure, Rackspace) – can be an inexpensive and flexible option with pay-as-you-go pricing for storing backups and other data.

A good website backup strategy would be to have multiple backups of your website data: one in a backup folder on your web hosting server, one downloaded to your local computer, and one in the cloud, such as with Backblaze B2. If I had to choose just one of these, I would choose backing up to the cloud, because it is geographically separated from both your local computer and your web host, it uses fault-tolerant and redundant data storage technologies to protect your data, and it is available from anywhere if you need to restore your site.

Backup Plugins for WordPress

Probably the easiest and most common way to implement a solid backup strategy for WordPress is to use one of the many backup plugins available for WordPress.

Fortunately, there are a number of good ones, available free or in “freemium” plans in which you can use the free version and pay for more features and capabilities only if you need them. The premium options can give you more flexibility in configuring backups or have additional options for where you can store the backups.

How to Choose a WordPress Backup Plugin

When considering which plugin to use, you should take into account a number of factors in making your choice.


Is the plugin actively maintained and up-to-date? You can determine this from the listing in the WordPress Plugin Repository. You also can look at reviews and support comments to get an idea of user satisfaction and how well issues are resolved.

Does the plugin work with your web hosting provider? Generally, well-supported plugins do, but you might want to check to make sure there are no issues with your hosting provider.

Does it support the cloud service or protocol you wish to use? This can be determined from looking at the listing in the WordPress Plugin Repository or on the developer’s website. Developers often will add support for cloud services or other backup destinations based on user demand, so let the developer know if there is a feature or backup destination you’d like them to add to their plugin.

Other features and options to consider in choosing a backup plugin are:

- Whether encryption of your backup data is available.
- What are the options for automatically deleting backups from the storage destination?
- Can you globally exclude files, folders, and specific types of files from the backup?
- Do the options for scheduling automatic backups meet your needs for frequency?
- Can you exclude/include specific database tables (a good way to save space in your backup)?

WordPress Backup Plugins Review

Let’s review a few of the top choices for WordPress backup plugins.

UpdraftPlus is one of the most popular backup plugins for WordPress, with over one million active installations. It is available in both free and Premium versions. UpdraftPlus just released support for Backblaze B2 Cloud Storage in their 1.13.9 update on September 25. According to the developer, support for Backblaze B2 was the most frequent request for a new storage option for their plugin.

B2 support is available in their Premium plugin and as an add-on to their standard product. Note: The developers of UpdraftPlus are offering a special 20% discount to Backblaze customers on the purchase of UpdraftPlus Premium by using the coupon code backblaze20. The discount is valid until the end of Friday, October 6th, 2017. XCloner — Backup and Restore is a useful open-source plugin with many options for backing up WordPress.


XCloner supports B2 Cloud Storage in their free plugin. BlogVault describes themselves as a “complete WordPress backup solution.” They offer a free trial of their paid WordPress backup subscription service that features real-time backups of changes to your WordPress site, as well as many other features. BlogVault has announced their intent to support Backblaze B2 Cloud Storage in a future update. BackWPup is a popular and free option for backing up WordPress. It supports a number of options for storing your backup, including the cloud, FTP, email, or on your local computer. WPBackItUp has been around since 2012 and is highly rated. It has both free and paid versions.

VaultPress is part of Automattic’s well-known WordPress product, JetPack. You will need a JetPack subscription plan to use VaultPress. There are different pricing plans with different sets of features. Backup by Supsystic supports a number of options for backup destinations, encryption, and scheduling. BackupWordPress is an open-source project on GitHub that has a popular and active following and many positive reviews. BackupBuddy, from iThemes, is the old-timer of backup plugins, having been around since 2010. iThemes knows a lot about WordPress, as they develop plugins, themes, utilities, and provide training in WordPress.

BackupBuddy’s backup includes all WordPress files, all files in the WordPress Media library, WordPress themes, and plugins. BackupBuddy generates a downloadable zip file of the entire WordPress website. Remote storage destinations also are supported.

WordPress and the Cloud

Do you use WordPress and back up to the cloud? We’d like to hear about it. We’d also like to hear whether you are interested in using B2 Cloud Storage for storing media files served by WordPress.

If you are, we’ll write about it in a future post. In the meantime, keep your eye out for new plugins supporting Backblaze B2, or better yet, urge them to support B2 if they’re not already.

The Best Backup Strategy is the One You Use

There are other approaches and tools for backing up WordPress that you might use. If you have an approach that works for you, we’d love to hear about it in the comments.

Post Syndicated from original The All Systems Go! 2017 schedule has been published! I am happy to announce that we have published the schedule! We are very happy with the large number and the quality of the submissions we got, and the resulting schedule is exceptionally strong.

Without further ado, here are a couple of keywords from the topics of the talks: 1password, azure, bluetooth, build systems, casync, cgroups, cilium, cockpit, containers, ebpf, flatpak, habitat, IoT, kubernetes, landlock, meson, OCI, rkt, rust, secureboot, skydive, systemd, testing, tor, varlink, virtualization, wifi, and more. Our speakers are from all across the industry: Chef, CoreOS, Covalent, Facebook, Google, Intel, Kinvolk, Microsoft, Mozilla, Pantheon, Pengutronix, Red Hat, SUSE, and more. Make sure to buy your ticket for All Systems Go! A limited number of tickets are left at this point, so make sure you get yours before we are all sold out! See you in Berlin!
