Last week, I reflected on Earth Day and how concern for the environment inspired me in school and then led to my focus on renewable energy starting in the mid-1970s. This brought me to Brattleboro in 1980 to work for the Northeast Sustainable Energy Association, which I did from 1980 through 1985.

To continue: in 1985 I was ready to leave the nonprofit world and see if I could make it focusing on writing as a career, with a continued focus on the environment. I had been writing for a few publications during my stint with the New Mexico Solar Energy Association and NESEA, most notably a monthly column on energy for the Journal of Light Construction, but I didn’t know if I could make a livelihood out of that.

Teaming up with Nadav

One of my major writing projects during this time was a guide to energy-efficient construction for the Energy Crafted Home Program, a utility-funded initiative in Massachusetts. As workloads increased, I hired support staff to help with specific projects. One of those hires, very significantly, was Nadav Malin, in 1991.

As my freelance writing career grew, more and more of my assignments were on mainstream building practices and were driven by magazine advertisers: “Alex, we need an article on ‘exterior insulation and finish systems’ and, by the way, it should mention these four companies…” Whenever I got a chance I would write about the intersections of building practices and the environment, whether relating to ozone depletion, global warming, renewable energy, indoor air quality, or water conservation. But these opportunities weren’t as frequent as I wanted.

EBN is born

In the spring of 1992, we sent a letter to a couple thousand members of NESEA announcing a new publication, Environmental Building News, and inviting them to subscribe at a special charter subscriber rate. We didn’t really know anything about publishing or direct mail.
We figured if there was interest, recipients of the letter would send in checks. If not, we wouldn’t have invested too much in the experiment. We could cut our losses and move on to the next contract-writing project.

But, lo and behold, we had an amazing 14% response to that mailing! (A rate just one-tenth that would be remarkable for any direct-mail campaign today.) Checks flooded in, and we printed our first issue in July 1992. Remembering my frustrations with advertiser influence over editorial content in other magazines, and knowing that we wanted to be able to say whatever we wanted about particular products and technologies, Nadav and I opted not to carry advertising.

Environmental Building News (EBN) grew, filling a need for the emerging green building community, and we soon had subscribers in all fifty states and in more than a dozen foreign countries. At the time, our business was called West River Communications, but when we launched our first website (in 1995, I believe) we changed our company name to E Build, Inc., to mirror the name of our website. (Later, we would sell that domain and use the proceeds to put our green products database online.)

Since launching EBN, we gradually grew the company, renamed it BuildingGreen, Inc. (after selling our domain), and launched other resources relating to green building products and the LEED Rating System. It has been an exciting, even if scary, time to be in publishing. We were an early adopter of desktop publishing and very early to the game with the World Wide Web. We have also bucked publishing trends throughout our history by eschewing advertising (the primary revenue source for most publications) and charging for Web-delivered content.

We have continued a mix of our own publishing and contract work, and we’ve been able to focus our contract work in ways that strengthen our in-house expertise in green building. Over the past two and a half decades, we’ve done work for the U.S.
Department of Energy, the Environmental Protection Agency, HUD, the U.S. General Services Administration, the American Institute of Architects, the U.S. Green Building Council, the Rocky Mountain Institute, and several national energy research laboratories. We even participated in the Greening of the White House project during the Clinton Administration.

A few articles on green architecture

In 1990 or ’91, Architecture magazine, for which I was a contributing editor, decided to produce a special issue on “green architecture,” a relatively new concept. I wrote several of the articles for that issue, which was well received, even winning an award as I recall. I began to wonder if there might be a niche for a publication focused specifically on green design and construction.

Nadav and I talked about this for a while, and in early 1992, we decided to give it a shot. If we succeeded, we could stabilize our revenue through subscriptions and be less dependent on the whims of other magazine editors and on contract work that took effort to drum up and could not always be counted on.

Freelance writing is a tough row to hoe

Indeed, when I started out on my own I worked two days a week for a local restoration builder. As my writing picked up, I gradually shifted to writing full time. I was doing a mix of freelance writing for six or seven magazines, but I learned pretty quickly that freelance writing is a tough row to hoe. I supplemented that writing with various technical writing projects for state energy offices, utility companies, nonprofit organizations, and a few manufacturers.

An early project was writing a series of home energy improvement pamphlets for the Massachusetts Audubon Society, and this led to writing the Consumer Guide to Home Energy Savings for the American Council for an Energy-Efficient Economy in 1989.
That little book was very well received, ultimately selling several hundred thousand copies and opening the door to lots of other writing opportunities in the energy field.

Partnering with Taunton Press to create GBA

For two years, during most of 2008 through early 2010, we were partnered with Taunton Press and during that time created GreenBuildingAdvisor. GBA is a tremendous resource, but it was launched just when the building industry collapsed, and in the hard realities of the weak building economy since, GBA shifted in ways that challenged the original partnership. Both BuildingGreen and Taunton agreed that parting ways made sense, and we separated very amicably two years ago. BuildingGreen became an independent company again, and Taunton Press took full ownership of GBA, though we continue some level of involvement (including this blog).

As always during our two-and-a-half-decade history, BuildingGreen has remained true to our initial vision as a mission-driven company focused on the environment. Our corporate mission statement reads, in part:

“…to facilitate transformation of the North American building industry into a force for local, regional and global environmental protection; for preservation and restoration of the natural environment; and for creation of healthy indoor environments.”

We are now a 20-person company serving builders, architects, researchers, educators, and policy makers nationwide and even internationally. We work collaboratively with many partners around the country. While two of our employees work remotely and come into the office only occasionally, most of us are located in one of the historic Estey Organ Buildings on Birge Street in Brattleboro. Nadav took over as president several years ago and is ably leading BuildingGreen into the future as we try to keep making a difference.

Readers can learn more about BuildingGreen and our products on our website.

Alex is founder of BuildingGreen, Inc.
and executive editor of Environmental Building News. He coauthored the just-published BuildingGreen special report, Better Window Decisions, which provides clear guidance on window selection. To keep up with Alex’s latest articles and musings, you can sign up for his Twitter feed.

The annual High School Certificate Examination-2019, to be conducted by the State Board of Secondary Examination, will begin on February 22 next year. Board president Jahan Ara Begum said on Friday that the Matriculation examinations for all the streams will continue till March 8. Practical examinations will be held between February 11 and February 16, Ms. Begum said.

She added that examinations equivalent to Matriculation in the State, such as the Madhyama exams for Sanskrit students and the State Open School Certificate Examinations (for drop-outs), will be held during the same period. Over six lakh students will appear for the HSC exams next year, sources said, adding that examinations would be held on the scheduled dates between 9 a.m. and 11.30 a.m.

January 22, 2015

While online education has been gaining traction in America for roughly 15 years, the inevitable maturation and spread of this technology into developing countries is bound to spark a revolution.

That was a key takeaway from a letter penned by the Bill and Melinda Gates Foundation, which, with an endowment of $42.3 billion, represents one of the largest private foundations on earth.

On this, its 15th anniversary, the Foundation laid bare its hopes for the world over the next 15 years, including the prediction that online education will reach hundreds of millions of people across the globe.

The growth of high-speed cell networks and a proliferation of affordable devices will largely fuel this accessibility.

Children who have grown up with smartphones and tablets, for instance, tend to utilize them intuitively. Therefore, according to the Foundation, kids in third-world countries will eventually be able to learn letters and numbers before even entering primary school, aided by software that adjusts to various learning speeds.

The Foundation also envisions online education that better feeds into specific career paths. Whereas early efforts in the field have “amounted to little more than pointing a camera at a university lecturer and hitting the ‘record’ button,” according to the Gates letter, new coursework would ostensibly hone in on specific professional requirements.

Perhaps most vital to the future of education, however, especially in developing countries, is closing the gender gap. One way this can be accomplished is by putting technology in the hands of women.
In Africa and South Asia, for instance, women are far less likely than men to own a cell phone.

While education can be a powerful force for equality, if such pain points aren’t addressed, writes the Foundation, “then education will become another cause of inequity, rather than a cure for it.”

For more predictions about how the world might look in 2030, check out the Gates letter in full.

Rust has had a great macro system since 1.0. Macros allow us to apply some code to multiple types or expressions, as they work by expanding themselves at compile time. This means that when you use a macro, you are effectively writing a lot of code before the actual compilation starts. This has two main benefits: first, the codebase can be easier to maintain by being smaller and reusing code; second, since macros expand before the creation of object code starts, you can abstract at the syntactic level. In this article, we’ll learn how to create our very own macros in Rust.

This Rust tutorial is an extract from Rust High Performance, authored by Iban Eguia Moraza.

For example, you can have a function like this one:

```rust
fn add_one(input: u32) -> u32 {
    input + 1
}
```

This function restricts the input to u32 types and the return type to u32. We could add some more accepted types by using generics, which may accept &u32 if we use the Add trait. Macros allow us to create this kind of code for any element that can be written to the left of the + sign, and they will be expanded differently for each type of element, creating different code for each case.

To create a macro, you will need to use a macro built into the language, the macro_rules!{} macro. This macro receives the name of the new macro as a first parameter and a block with the macro code as a second element. The syntax can be a bit complex the first time you see it, but it can be learned quickly. Let’s start with a macro that does just the same as the function we saw before:

```rust
macro_rules! add_one {
    ($input:expr) => {
        $input + 1
    }
}
```

You can now call that macro from your main() function by calling add_one!(integer);. Note that the macro needs to be defined before the first call, even if it’s in the same file. It will work with any integer, which wasn’t possible with functions.

Let’s analyze how the syntax works. In the block after the name of the new macro (add_one), we can see two sections.
In the first part, on the left of the =>, we see $input:expr inside parentheses. Then, to the right, we see a Rust block where we do the actual addition. The left part works similarly (in some ways) to a pattern match. You can add any combination of characters and then some variables, all of them starting with a dollar sign ($) and showing the type of variable after a colon. In this case, the only variable is the $input variable, and it’s an expression. This means that you can insert any kind of expression there, and it will be written in the code to the right, substituting the variable with the expression.

Creating Macro variants

As you can see, it’s not as complicated as you might think. As I wrote, you can have almost any pattern on the left of the macro_rules!{} side. Not only that, you can also have multiple patterns, as if it were a match statement, so that if one of them matches, it will be the one expanded. Let’s see how this works by creating a macro which, depending on how we call it, will add one or two to the given integer:

```rust
macro_rules! add {
    {one to $input:expr} => ($input + 1);
    {two to $input:expr} => ($input + 2);
}

fn main() {
    println!("Add one: {}", add!(one to 25/5));
    println!("Add two: {}", add!(two to 25/5));
}
```

You can see a couple of clear changes to the macro. First, we swapped braces for parentheses and parentheses for braces in the macro. This is because in a macro, you can use interchangeable braces ({ and }), square brackets ([ and ]), and parentheses (( and )). Not only that, you can use them when calling the macro. You have probably already used the vec![] macro and the format!() macro, and we saw the lazy_static!{} macro in the last chapter. We use brackets and parentheses here just by convention, but we could call the vec!{} or the format![] macros the same way, because we can use braces, brackets, and parentheses in any macro call. The second change was to add some extra text to our left-hand side patterns.
We now call our macro by writing the text one to or two to, so I also removed the one redundancy from the macro name and called it add!(). This means that we now call our macro with literal text. That is not valid Rust on its own but, since we are using a macro, we modify the code we are writing before the compiler tries to understand actual Rust code, and the generated code is valid. We could add any text that does not end the pattern (such as parentheses or braces) to the pattern.

The final change was to add a second possible pattern. We can now add one or two, and the only difference is that the right side of the macro definition must now end with a trailing semicolon for each pattern (the last one is optional) to separate each of the options.

A small detail that I also added in the example was when calling the macro in the main() function. As you can see, I could have added one or two to 5, but I wrote 25/5 for a reason. When compiling this code, this will be expanded to 25/5 + 1 (or 2, if you use the second variant). This will later be optimized at compile time, since it will know that 25/5 + 1 is 6, but the compiler will receive that expression, not the final result. The macro system will not calculate the result of the expression; it will simply copy into the resulting code whatever you give to it and then pass it to the next compiler phase.

You should be especially careful with this when a macro you are creating calls another macro. They will get expanded recursively, one inside the other, so the compiler will receive a bunch of final Rust code that will need to be optimized. Issues related to this were found in the CLAP crate that we saw in the last chapter, since the exponential expansions were adding a lot of bloat code to its executables. Once they found out that there were too many macro expansions inside the other macros and fixed it, they reduced the size of their binaries by more than 50%.

Macros allow for an extra layer of customization.
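As an aside, the copy-paste behavior described above is easy to observe directly. The following sketch is mine, not from the original article: the hypothetical double!() macro uses its captured expression twice on the right-hand side, so an argument with a side effect is executed once per use.

```rust
// Demo macro: the captured expression is pasted twice, so whatever
// we pass in is evaluated twice in the expanded code.
macro_rules! double {
    ($e:expr) => { $e + $e }
}

// Helper with a visible side effect: increments the counter and returns it.
fn bump(counter: &mut u32) -> u32 {
    *counter += 1;
    *counter
}

fn main() {
    let mut counter = 0;
    // Expands to: bump(&mut counter) + bump(&mut counter)
    let result = double!(bump(&mut counter));
    // The call ran twice: the first paste yields 1, the second yields 2.
    assert_eq!(counter, 2);
    assert_eq!(result, 3); // 1 + 2
    println!("counter = {}, result = {}", counter, result);
}
```

A plain function could not behave this way: its argument is evaluated exactly once before the call, while the macro works at the syntactic level, before evaluation is even a concept.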
You can repeat arguments more than once. This is common, for example, in the vec![] macro, where you create a new vector with information at compile time. You can write something like vec![3, 4, 76, 87];. How does the vec![] macro handle an unspecified number of arguments?

Creating Complex macros

We can specify that we want multiple expressions in the left-hand side pattern of the macro definition by adding a * for zero or more matches or a + for one or more matches. Let’s see how we can do that with a simplified my_vec![] macro:

```rust
macro_rules! my_vec {
    ($($x:expr),*) => {{
        let mut vector = Vec::new();
        $(vector.push($x);)*
        vector
    }}
}
```

Let’s see what is happening here. First, we see that on the left side, we have two variables, denoted by the two $ signs. The first makes reference to the actual repetition: each comma-separated expression will generate an $x variable. Then, on the right side, we use the various repetitions to push $x to the vector once for every expression we receive.

There is another new thing on the right-hand side. As you can see, the macro expansion starts and ends with a double brace instead of using only one. This is because, once the macro gets expanded, it will substitute the given expression for a new expression: the one that gets generated. Since what we want is to return the vector we are creating, we need a new scope where the last sentence will be the value of the scope once it gets executed. You will be able to see it more clearly in the next code snippet.
We can call this code from the main() function:

```rust
fn main() {
    let my_vector = my_vec![4, 8, 15, 16, 23, 42];
    println!("Vector test: {:?}", my_vector);
}
```

It will be expanded to this code:

```rust
fn main() {
    let my_vector = {
        let mut vector = Vec::new();
        vector.push(4);
        vector.push(8);
        vector.push(15);
        vector.push(16);
        vector.push(23);
        vector.push(42);
        vector
    };
    println!("Vector test: {:?}", my_vector);
}
```

As you can see, we need those extra braces to create the scope that will return the vector so that it gets assigned to the my_vector binding.

You can have multiple repetition patterns in the left expression, and they will be repeated for every use, as needed on the right:

```rust
macro_rules! add_to_vec {
    ($( $x:expr; [ $( $y:expr ),* ]);*) => {
        &[ $($( $x + $y ),*),* ]
    }
}
```

In this example, the macro can receive one or more $x; [$y1, $y2,…] inputs. So, for each input, it will have one expression, then a semicolon, then a bracket with multiple sub-expressions separated by commas, and finally a closing bracket and a semicolon. But what does the macro do with this input? Let’s check the right-hand side of it.

As you can see, this will create multiple repetitions. We can see that it creates a slice (&[T]) of whatever we feed to it, so all the expressions we use must be of the same type. Then, it will start iterating over all $x variables, one per input group. So if we feed it only one input, it will iterate once for the expression to the left of the semicolon. Then, it will iterate once for every $y expression associated with the $x expression, add them with the + operator, and include the result in the slice.

If this was too complex to understand, let’s look at an example. Let’s suppose we call the macro with 65; [22, 34] as input. In this case, 65 will be $x, and 22 and 34 will be the $y variables associated with 65. So, the result will be a slice like this: &[65+22, 65+34]. Or, if we calculate the results: &[87, 99].
If, on the other hand, we give two groups of variables by using 65; [22, 34]; 23; [56, 35] as input, in the first iteration $x will be 65, while in the second one it will be 23. The $y variables associated with 65 will be 22 and 34, as before, and the ones associated with 23 will be 56 and 35. This means that the final slice will be &[87, 99, 79, 58], where 87 and 99 work the same way as before, and 79 and 58 are the result of adding 23 to 56 and 23 to 35.

This gives you much more flexibility than functions but, remember, all this will be expanded during compile time, which can make your compilation time much slower and the final codebase larger and slower still if the macro duplicates too much code. In any case, there is more flexibility to it yet. So far, all variables have been of the expr kind. We have used this by declaring $x:expr and $y:expr but, as you can imagine, there are other kinds of macro variables. The list follows:

- expr: expressions that you can write after an = sign, such as 76+4 or if a==1 {"something"} else {"other thing"}.
- ident: an identifier or binding name, such as foo or bar.
- path: a qualified path. This will be a path that you could write in a use sentence, such as foo::bar::MyStruct or foo::bar::my_func.
- ty: a type, such as u64 or MyStruct. It can also be a path to the type.
- pat: a pattern that you can write on the left side of an = sign or in a match expression, such as Some(t) or (a, b, _).
- stmt: a full statement, such as a let binding like let a = 43;.
- block: a block element that can have multiple statements and a possible expression between braces, such as {vec.push(33); vec.len()}.
- item: what Rust calls items; for example, function or type declarations, complete modules, or trait definitions.
- meta: a meta element, which you can write inside an attribute (#[]). For example, cfg(feature = "foo").
- tt: any token tree that will eventually get parsed by a macro pattern, which means almost anything.
This is useful for creating recursive macros, for example.

As you can imagine, some of these kinds of macro variables overlap, and some of them are just more specific than others. The use will be verified on the right-hand side of the macro, in the expansion, since you might try to use a statement where an expression must be used, even though you might use an identifier too, for example. There are some extra rules, too, as we can see in the Rust documentation:

- Statements and expressions can only be followed by =>, a comma, or a semicolon.
- Types and paths can only be followed by =>, the as or where keywords, or any commas, =, |, ;, :, >, [, or {.
- Patterns can only be followed by =>, the if or in keywords, or any commas, =, or |.

Let’s put this into practice by implementing a small Mul trait for a currency type we can create. This is an adapted example of some work we did when creating the Fractal Credits digital currency. In this case, we will look at the implementation of the Amount type, which represents a currency amount. Let’s start with the basic type definition:

```rust
#[derive(Copy, Clone, PartialEq, Eq, PartialOrd, Ord)]
pub struct Amount {
    value: u64,
}
```

This amount will be divisible by up to three decimals, but it will always be an exact value. We should be able to add an Amount to the current Amount, or to subtract it. I will not explain these trivial implementations, but there is one implementation where macros can be of great help. We should be able to multiply the amount by any positive integer, so we should implement the Mul trait for the u8, u16, u32, and u64 types. Not only that, we should be able to implement the Div and Rem traits, but I will leave those out, since they are a little bit more complex. You can check them in the full Fractal Credits implementation.

The only thing the multiplication of an Amount with an integer should do is to multiply the value by the integer given.
Let’s see a simple implementation for u8:

```rust
use std::ops::Mul;

impl Mul<u8> for Amount {
    type Output = Self;

    fn mul(self, rhs: u8) -> Self::Output {
        Self { value: self.value * rhs as u64 }
    }
}

impl Mul<Amount> for u8 {
    type Output = Amount;

    fn mul(self, rhs: Amount) -> Self::Output {
        Self::Output { value: self as u64 * rhs.value }
    }
}
```

As you can see, I implemented it both ways, so that you can put the Amount to the left and to the right of the multiplication. If we had to do this for all integers, it would be a big waste of time and code. And if we had to modify one of the implementations (especially for the Rem functions), it would be troublesome to do it in multiple code points. Let’s use macros to help us.

We can define a macro, impl_mul_int!{}, which will receive a list of integer types and then implement the Mul trait back and forth between all of them and the Amount type. Let’s see:

```rust
macro_rules! impl_mul_int {
    ($($t:ty)*) => ($(
        impl Mul<$t> for Amount {
            type Output = Self;

            fn mul(self, rhs: $t) -> Self::Output {
                Self { value: self.value * rhs as u64 }
            }
        }

        impl Mul<Amount> for $t {
            type Output = Amount;

            fn mul(self, rhs: Amount) -> Self::Output {
                Self::Output { value: self as u64 * rhs.value }
            }
        }
    )*)
}

impl_mul_int! { u8 u16 u32 u64 usize }
```

As you can see, we specifically ask for the given elements to be types, and then we implement the trait for all of them. So, for any code that you want to implement for multiple types, you might as well try this approach, since it will save you from writing a lot of code and make it more maintainable.

If you found this article useful and would like to learn more such tips, head on over to pick up the book, Rust High Performance, authored by Iban Eguia Moraza.
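To round out the Amount example, here is a compact, self-contained usage sketch. The type and macro follow the article; the Debug derive (needed by assert_eq!) and the concrete values are my additions for illustration.

```rust
use std::ops::Mul;

// Debug is added here (beyond the article's derives) so assertions can print values.
#[derive(Copy, Clone, Debug, PartialEq, Eq, PartialOrd, Ord)]
pub struct Amount {
    value: u64,
}

// Implements Mul between Amount and each listed integer type, in both directions.
macro_rules! impl_mul_int {
    ($($t:ty)*) => ($(
        impl Mul<$t> for Amount {
            type Output = Self;

            fn mul(self, rhs: $t) -> Self::Output {
                Self { value: self.value * rhs as u64 }
            }
        }

        impl Mul<Amount> for $t {
            type Output = Amount;

            fn mul(self, rhs: Amount) -> Self::Output {
                Self::Output { value: self as u64 * rhs.value }
            }
        }
    )*)
}

impl_mul_int! { u8 u16 u32 u64 usize }

fn main() {
    let price = Amount { value: 500 };
    // The Amount can sit on either side of the multiplication.
    assert_eq!(price * 3u16, Amount { value: 1500 });
    assert_eq!(4u8 * price, Amount { value: 2000 });
    println!("{:?}", price * 3usize);
}
```

One invocation of the macro generates ten trait implementations here (two per type), which is exactly the kind of repetitive code that would otherwise have to be written and maintained by hand.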

Yesterday, James Bennett, a software developer and an active contributor to the Django web framework, issued the summary of a proposal on dissolving the Django core team and revoking commit bits. Re-forming or reorganizing the Django core team has been a topic of discussion for the last couple of years, and this proposal aims to turn that discussion into real action.

What are the reasons behind the proposal to dissolve the Django core team?

Unable to bring in new contributors

Django, the open source project, has been facing some difficulty in recruiting and retaining contributors to keep the project alive. Typically, open source projects avoid this situation through corporate sponsorship of contributions: companies which rely on the software have employees who are responsible for maintaining it. This was true in the case of Django as well, but it hasn’t really worked out as a long-term plan.

Compared to the growth of the web framework, it has hardly been able to draw contributors from across its entire user base. The project has not been able to bring in new committers at a sufficient rate to replace those who have become less active or even completely inactive. This essentially means that Django is dependent on the goodwill of contributors who mostly don’t get paid to work on it and are very few in number. This poses a risk to the future of the Django web framework.

Django committer is seen as a high-prestige title

Currently, decisions are made by consensus, involving input from committers and non-committers on the django-developers list, and the commits to the main Django repository are made by the Django Fellows. Even people who have commit bits of their own, and therefore have the right to push their changes straight into Django, typically use pull requests and start a discussion.
The actual governance rarely relies on the committers but, still, Django committer is seen as a high-prestige title, and committers are given a lot of respect by the wider community. This creates an impression among potential contributors that they’re not “good enough” to match up to those “awe-inspiring titanic beings.”

What is this proposal about?

Given the reasons above, this proposal is being made to dissolve the Django core team and revoke the commit bits. Instead, the proposal would introduce two roles called Mergers and Releasers. Mergers would merge pull requests into Django, and Releasers would package and publish releases. Rather than being all-powerful decision-makers, these would be bureaucratic roles. The current set of Fellows will act as the initial set of Mergers, and something similar will happen for Releasers.

Instead of the committers making decisions, governance would take place entirely in public, on the django-developers mailing list. But as a final tie-breaker, the technical board would be retained and would get some extra decision-making power. These powers would mostly relate to the selection of the Merger/Releaser roles and confirming that new versions of Django are ready for release. The technical board will be elected far less often than it currently is, and the voting would also be open to the public. The Django Software Foundation (DSF) will act as a neutral administrator of the technical board elections.

What are the goals this proposal aims to achieve?

Mr.
Bennett believes that eliminating the distinction between the committers and the “ordinary contributors” will open the door to more contributors: “Removing the distinction between godlike ‘committers’ and plebeian ordinary contributors will, I hope, help to make the project feel more open to contributions from anyone, especially by making the act of committing code to Django into a bureaucratic task, and making all voices equal on the django-developers mailing list.”

The technical board remains as a backstop for resolving deadlocked decisions. The proposal would give the board additional authority, such as issuing the final go-ahead on releases. Retaining the technical board will ensure that Django does not descend into some sort of “chaotic mob rule.” Also, with this proposal, the formal description of Django’s governance becomes much more in line with the reality of how the project actually works and has worked for the past several years.

To know more in detail, read the post by James Bennett: Django Core no more.

Today, TigerGraph, the world’s fastest graph analytics platform for the enterprise, introduced TigerGraph Cloud, the simplest, most robust and cost-effective way to run scalable graph analytics in the cloud.

With TigerGraph Cloud, users can easily get their TigerGraph services up and running. They can also tap into TigerGraph’s library of customizable graph algorithms to support key use cases, including AI and machine learning. It provides data scientists, business analysts, and developers with the ideal cloud-based service for applying SQL-like queries for faster and deeper insights into data, and it enables organizations to tap into the power of graph analytics within hours.

Features of TigerGraph Cloud

Simplicity: It forgoes the need to set up, configure or manage servers, schedule backups or monitoring, or look for security vulnerabilities.

Robustness: TigerGraph relies on the same framework, providing point-in-time recovery, powerful configuration options, and stability, that has been used for its own workloads over several years.

Application Starter Kits: It offers out-of-the-box starter kits for quicker application development for use cases such as Anti-Fraud, Anti-Money Laundering (AML), Customer 360, Enterprise Graph analytics and more. These starter kits include graph schemas, sample data, preloaded queries and a library of customizable graph algorithms (PageRank, Shortest Path, Community Detection, and others). TigerGraph makes it easy for organizations to tailor such algorithms for their own use cases.

Flexibility and elastic pricing: Users pay for exactly the hours they use and are billed on a monthly basis. Spin up a cluster for a few hours for minimal cost, or run larger, mission-critical workloads with predictable pricing.

This new cloud offering will also be available for production on AWS, with other cloud availability forthcoming.
Yu Xu, founder and CEO of TigerGraph, said, “TigerGraph Cloud addresses these needs, and enables anyone and everyone to take advantage of scalable graph analytics without cloud vendor lock-in. Organizations can tap into graph analytics to power explainable AI – AI whose actions can be easily understood by humans – a must-have in regulated industries. TigerGraph Cloud further provides users with access to our robust graph algorithm library to support PageRank, Community Detection and other queries for massive business advantage.”

Philip Howard, research director at Bloor Research, said, “What is interesting about TigerGraph Cloud is not just that it provides scalable graph analytics, but that it does so without cloud vendor lock-in, enabling companies to start immediately on their graph analytics journey.”

According to TigerGraph, “Compared to TigerGraph Cloud, other graph cloud solutions are up to 116x slower on two hop queries, while TigerGraph Cloud uses up to 9x less storage. This translates into direct savings for you.”

TigerGraph also announces New Marquee Customers

TigerGraph also announced the addition of new customers, including Intuit, Zillow, and PingAn Technology, among other leading enterprises in cybersecurity, pharmaceuticals, and banking.

To learn more about TigerGraph Cloud, visit its official website.

Read Next

MongoDB switches to Server Side Public License (SSPL) to prevent cloud providers from exploiting its open source code
Google Cloud Storage Security gets an upgrade with Bucket Lock, Cloud KMS keys and more
OpenStack Foundation to tackle open source infrastructure problems, will conduct conferences under the name ‘Open Infrastructure Summit’

Last year in November, at the Chrome Dev Summit keynote, Google introduced .dev, a domain dedicated to developers and technology. The registration process started on Feb 16, and the team is set to launch its Early Access Program.

According to the timeline shared at the Chrome Dev Summit, the Early Access Program will run from Feb 19 at 8:00 am PST to Feb 28 at 7:59 am PST. Under this program, users can register available .dev domains by paying an extra fee. This fee will decrease as the General Availability phase approaches, which starts Feb 28 onwards. After registering a domain, users will pay $12/year for .dev domains.

In addition to providing a dedicated space for developers, this domain comes with built-in security, as it is included on the HSTS (HTTP Strict Transport Security) preload list. This essentially means that all connections to .dev websites and pages will be made over HTTPS.

Looking at Google’s track record of killing its products over time, some Hacker News users were a little skeptical about this service. One user commented, “I wouldn’t lease the domain through Google domains. Use a different registrar — if possible, one that you’ll be able to trust.
That registrar will work with the registry of the TLD, which would be Google in this case, and has a much better chance of actually resolving issues than if you were a direct customer of Google Domains.”

Another user said, “They have a well-established track record of enthusiastically backing exciting new projects way outside of their core competency just to dump them like hot garbage several years later…It doesn’t seem like a smart move to lease a domain from a politically active mega-monopoly that might decide to randomly become your competitor in 2 years.”

Countering this argument, one of the Google developers from the team launching .dev said, “You’ll be glad to know that TLDs can’t simply be discontinued like other products might be. ICANN doesn’t allow it. The procedures in place preventing a live TLD from shutting down are called EBERO.”

Read more about the .dev domain on its official website.

Read Next

Chromium developers propose an alternative to webRequest API that could result in existing ad blockers’ end
Google Chrome developers “clarify” the speculations around Manifest V3 after a study nullifies their performance hit argument
Regulate Google, Facebook, and other online platforms to protect journalism, says a UK report
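The HSTS preload behavior that makes .dev HTTPS-only can be sketched in a few lines. This is an illustrative model, not browser code: real browsers ship the full Chromium preload list (which includes entire TLDs such as .dev), and the `PRELOADED_TLDS` set and `enforce_https` helper below are hypothetical names for this sketch.

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical subset of the HSTS preload list; browsers ship the
# complete list, which contains whole TLDs like .dev and .app.
PRELOADED_TLDS = {"dev", "app", "page"}

def enforce_https(url):
    """Rewrite http:// URLs to https:// when the host's TLD is on
    the (illustrative) preload list, mimicking the upgrade a browser
    performs before any network request is ever sent."""
    parts = urlsplit(url)
    tld = parts.hostname.rsplit(".", 1)[-1] if parts.hostname else ""
    if parts.scheme == "http" and tld in PRELOADED_TLDS:
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)

print(enforce_https("http://example.dev/docs"))  # upgraded to https
print(enforce_https("http://example.com/docs"))  # left unchanged
```

Because the upgrade happens client-side before any request is made, a plaintext connection to a .dev site is never attempted, which is the security guarantee the article describes.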

Aug. 27: Rep. Potvin encourages public opinions, ideas at Dept. of Education forums

State Rep. Phil Potvin invites residents of the 102nd House District to attend public meetings held by the Michigan Department of Education and to offer new ideas on updates to the state’s science and social studies standards.

“The Michigan Department of Education puts a lot of thought into changes in their curriculum, and I encourage all members of our community to attend one of these forums,” said Rep. Potvin, R-Cadillac. “It’s exciting to see our dynamic education system changing right before our eyes.”

Members of the community will have an opportunity to comment on proposed changes to Michigan’s science and social studies content standards for the state’s public schools. Developed by education experts from around the nation, the proposed changes have been designed to provide young Michiganders the best educational experience possible.

Forums in and around the 102nd House District will be held at the following times and locations:

Thursday, Sept. 3, from 5 to 8 p.m., at Central Michigan University’s Bovee University Center in Mt. Pleasant;
Monday, Sept. 14, from 5 to 8 p.m., at the Gerald R. Ford Museum, located at 303 Pearl St. NW in Grand Rapids; and
Monday, Sept. 21, from 5 to 8 p.m., at the Traverse Bay Intermediate School District, located at 1101 Red Drive in Traverse City.

“The quality of our kids’ educations depends on these standards,” Rep. Potvin said. “Because a child’s education is the product of a partnership between teachers and parents, I encourage everyone to bring their thoughts to the table.”

No appointment is necessary. Residents with questions about the forums are encouraged to contact Rep. Potvin’s office by phone at 517-373-1747, or by email.

Satellite operator SES has struck a multi-year capacity deal with the Eurovision network to broadcast the London 2012 Paralympic Games and other sporting events internationally.

Eurovision is increasing its SES capacity to 54MHz and will use the SES teleport and the NSS-806 satellite at 40.5° West to broadcast the London 2012 Olympic Games and Paralympic Games to TV audiences across Latin America.

The new capacity will also be used for contribution and distribution of major 2013-2014 sporting events in high definition, including the Tour de France, Formula One, and European and Latin American football, including the 2014 FIFA World Cup in Brazil.