Web Development

Drupal how to add/remove database tables dynamically

Drupal - Open Source Content Platform - Thu, 11/15/2018 - 11:42

Our website has several administrators. Each of them moderates a large amount of data.

So the website sometimes becomes extremely slow. I think we can split our main data table into smaller chunks, so that each administrator is associated with their own database table. What is the best approach to implement this?

The nuance is that administrators can be replaced, so we need to update the database schema (add/remove tables) dynamically somehow. Is there a way to add/remove tables dynamically (not via hook_install)?

submitted by /u/dakruchko

Help with graduate thesis regarding IT

webdev: reddit for web developers - Thu, 11/15/2018 - 11:15

Hi guys,

My graduate thesis topic is: the influence of the IT office on employees' productivity and satisfaction.

In order to conclude which office layout suits IT employees best, I need to collect data from people currently working in IT companies, so your help is crucial.

I have prepared an online survey to collect data. It takes 4 minutes to complete and it is anonymous.

If you have spare time please help me get that graduate degree.

Contact me via DM so I can send you the link to the survey.

submitted by /u/Pearl_ia
Categories: Web Development

I'm Scott Tolinski and I make video tutorials and co-host Syntax AMA

webdev: reddit for web developers - Thu, 11/15/2018 - 10:42
About Scott

Scott Tolinski is a full-stack developer from Denver, Colorado. Scott has worked the majority of his career at agencies of all sizes, from a 12-person design shop to a mega agency with clients like the Ford Motor Company, and everything in between.

In his spare time, Scott created Level Up Tutorials on YouTube in 2012 to provide free tutorial content, and he has since released over a thousand free tutorial videos. Now Scott has left agency life to work on Level Up Tutorials full time, offering a new premium tutorial series every month via https://www.leveluptutorials.com/pro

Scott, along with Wes Bos, hosts the weekly Syntax podcast, which is for web developers looking for tangible takeaways (tasty treats) and smooth ad-read transitions.

Scott is happy to answer any questions related to web development, agency life, learning, running a business, fitness, breakdancing, tea, video production, tutorial creation, gear, hockey, college football, the Detroit Lions and golden era hip hop.



Scott @ Zeit Day 2018 - 2 Fast 2 Furious https://www.youtube.com/watch?v=xK3BhwDRuL8

submitted by /u/stolinski
Categories: Web Development

I built my first React Native app, would love some feedback!

webdev: reddit for web developers - Thu, 11/15/2018 - 10:26

Hello r/webdev,

I've been working on this little app for the past few weeks to learn React Native and, more importantly, Redux. I would appreciate any feedback on the way I've structured the store, set up actions, and handled the reducers. Coming from the Vue and Vuex world, Redux has been a struggle to grasp, but this project really helped me understand it better. Anyway, here are the code and app store links if you want to try it out.

GitHub: https://github.com/krestaino/markets-react

App Store: https://itunes.apple.com/us/app/markets-react/id1441913854

Google Play Store: https://play.google.com/store/apps/details?id=com.kmr.marketsreact

submitted by /u/_kmr
Categories: Web Development

Scaling CSS: Two Sides of a Spectrum

CSS-Tricks - Thu, 11/15/2018 - 10:16

The subject of scaling CSS came up a lot in a recent ShopTalk Show with Ben Frain. Ben has put a lot of thought into the subject, even writing a complete book on it, Enduring CSS, which is centered around a whole ECSS methodology.

He talked about how there are essentially two solutions for styling at scale:

  1. Total isolation
  2. Total abstraction

Total isolation is some version of writing styles scoped to some boundary that you've set up (like a component) in which those styles don't leak in or out.

Total abstraction is some version of writing styles that are global, yet so generic and re-usable, that they have no unintended side effects.

Total isolation might come from <style scoped> in a .vue file, CSS modules in which CSS class selectors and HTML class attributes are dynamically generated gibberish, or a CSS-in-JS project, like glamorous. Even strictly-followed naming conventions like BEM can be a form of total isolation.

Total abstraction might come from a project, like Tachyons, that gives you a fixed set of class names to use for styling (Tailwind is like a configurable version of that), or a programmatic tool (like Atomizer) that turns specially named HTML class attributes into a stylesheet with exactly what it needs.

It's the middle ground that has problems. It's using a naming methodology, but not holding strictly to it. It's using some styles in components, but also having a global stylesheet that does random other things. Or, it's having lots of developers contributing to a styling system that has no strict rules and mixes global and scoped styles. Any stylesheet that grows and grows and grows. Fighting it by removing some unused styles isn't a real solution (and here's why).

Note that the web is a big place and not all projects need a scaling solution. A huge codebase with hundreds of developers that needs to be maintained for decades absolutely does. My personal site does not. I've had my fair share of styling problems, but I've never been so crippled by them that I've needed to implement something as strict as Atomic CSS (et al.) to get work done. Nor at any job I've had so far. I see the benefits though.

Imagine the scale of Twitter.com over a decade! Nicolas has a great thread where he compares Twitter's PWA against Twitter's legacy desktop website.

The legacy site's CSS is what happens when hundreds of people directly write CSS over many years. Specificity wars, redundancy, a house of cards that can't be fixed. The result is extremely inefficient and error-prone styling that punishes users and developers alike.

The post Scaling CSS: Two Sides of a Spectrum appeared first on CSS-Tricks.

Categories: Web Development

Second opinion on a design

webdev: reddit for web developers - Thu, 11/15/2018 - 10:03

Hi, our web team has put together a design for the company I work for. I'm not sure about the design, so I'd like a second opinion. If you can help, please private message me or comment below and I'll private message you a link to the design. Thanks.

submitted by /u/bp5678
Categories: Web Development

The Annoying Site - What are your thoughts on this?

webdev: reddit for web developers - Thu, 11/15/2018 - 10:02

I came across this website, which was claimed on Twitter to be the most annoying website. I opened it using the in-app browser and just got a picture of a cat.

I posted a direct link to it here and a few people voiced their concerns before it got pulled down. (Sorry to those affected.)

A quick search on Twitter has shown me that it logged some people out of all their accounts, opened pop-ups, increased Chrome's RAM and CPU usage, and supposedly moved a Finder window (is that even possible?).

I’ve found its GitHub page. It seems it was made as a tool to aid in discussing the power of web platforms.

What are your thoughts regarding it?

submitted by /u/yajCee
Categories: Web Development

Resize on-the-fly for Web

webdev: reddit for web developers - Thu, 11/15/2018 - 09:57

Resize on-the-fly for the web: up to one thousand images per second on just one Tesla V100 GPU

The Fastvideo company has been developing its GPU-based image processing SDK since 2011 and has achieved outstanding image processing performance on NVIDIA GPUs (mobile, laptop, desktop, server). Fastvideo implemented the first JPEG codec on CUDA, which is still the fastest solution on the market. Apart from JPEG, the company has also released a JPEG2000 codec on GPU and an SDK with high-performance image processing algorithms. The SDK offers exceptional speed for many imaging applications, especially when CPU-based solutions cannot offer sufficient performance or latency. Below we introduce the “Resize on-the-fly” solution from Fastvideo.

Resize on-the-fly

In various imaging applications we have to resize images, and quite often we need to resize JPEGs. In that case the task gets more complicated: we can't resize directly because the images are compressed. The solution is not difficult: decompress the image, resize it, then encode it again to get the resized image. Nevertheless, we face difficulties once we assume we need to resize many millions of images every day, and questions of performance optimization arise. Now we need not only to get it right, we have to do it very fast. The good news is that it can be done.

In the standard set of demo applications in the Fastvideo SDK for NVIDIA GPUs there is a sample application for JPEG resizing. It's supplied both as binaries and with source code so users can integrate it easily into their software. This software solves the problem of fast JPEG resize on-the-fly, which is essential for many high-performance applications, including high-load web services. The application resizes JPEGs very quickly, and users can test the binary to check image quality and performance.

If we consider a high-load web application as an example, we can formulate the following task: we have a big database of images in JPEG format, and we need to resize these images quickly with minimum latency. This is also a problem for big sites with responsive design: how do you prepare a set of images at optimal resolutions as fast as possible to minimize traffic?

First we need to answer the question: why JPEG? Modern internet services mostly receive this file type from their users, who create the images with mobile phones or cameras. In that situation JPEG is a standard and reasonable choice. Other formats do exist on mobile phones and cameras, but they are not as widespread as JPEG. Many images are stored as WebP, but that format is still not as popular as JPEG. Moreover, encoding and decoding WebP images is much slower than JPEG, which also matters.

Quite often, such high-load web services keep multiple copies of the same image at different resolutions to achieve low-latency responses. That approach leads to extra storage expense, especially for high-performance applications, web services, and big image databases. The idea behind a better solution is simple: store just one JPEG image in the database instead of a series of copies, and transform it to the desired resolution on the fly, that is, at very high speed and with minimum latency.

How to prepare an image database

We will store all images in the database in JPEG format, but it's not a good idea to use them as-is. It's important to prepare all images in the database for fast future decoding. That is why we pre-process all images offline to insert so-called “JPEG restart markers” into each image. The JPEG standard allows such markers, and most JPEG decoders can process images with restart markers without problems. Most smartphones and cameras don't produce JPEGs with restart markers, so we add these markers with our software. This is a lossless procedure, so the image content doesn't change, though the file size will be slightly larger afterwards.
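As a sanity check during that offline pass, the restart interval a file declares can be read straight from its DRI marker segment. The following Python sketch is an illustration of how one might verify the prepared database, not part of the Fastvideo SDK:

```python
def restart_interval(jpeg_bytes: bytes) -> int:
    """Return the restart interval declared by a DRI marker segment,
    or 0 if the JPEG defines no restart markers."""
    i = 2  # skip the SOI marker (FF D8)
    while i + 4 <= len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:
            break                     # malformed stream; stop scanning
        marker = jpeg_bytes[i + 1]
        if marker == 0xDA:            # SOS: entropy-coded data starts here
            break
        if marker == 0xDD:            # DRI: payload is a 2-byte interval
            return int.from_bytes(jpeg_bytes[i + 4:i + 6], "big")
        seg_len = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        i += 2 + seg_len              # jump over this marker segment
    return 0
```

Running such a check over the prepared database confirms that every file really carries a DRI segment before it ever reaches the GPU decoder.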

To make the full solution efficient, we can use statistics about the most frequent user device resolutions. Users view pictures on phones, laptops, or PCs, and these pictures often occupy only part of the screen, so image resolutions need not be very big. This is grounds to conclude that most images in our database need resolutions of no more than 1K or 2K. We will consider both choices to evaluate latency and performance. If a user's device needs a higher resolution, we can simply resize with an upscaling algorithm. It is also possible to choose a bigger default image resolution for the database; the general solution stays the same.

For practical purposes we consider JPEG compression with parameters corresponding to “visually lossless compression”: JPEG quality around 90% with 4:2:0 or 4:4:4 subsampling. To evaluate JPEG resize time in testing we chose a 50% downscale in both width and height. In real life various scaling coefficients are used, but 50% can be considered a standard test case.

1. Algorithm description for JPEG Resize on-the-fly software

This is exactly what we do for fast JPEG resizing in our software:

  1. Copy JPEG images from database to system memory

  2. Parse JPEG and check EXIF sections (orientation, color profile, etc.)

  3. If we see a color profile on the JPEG image, we read it from the file header and save it for future use

  4. Copy JPEG image from CPU to GPU memory

  5. JPEG decoding

  6. Image resize according to Lanczos algorithm (50% downscaling as an example)

  7. Sharpening

  8. JPEG encoding

  9. Copy new image from GPU to system memory

  10. Add previously saved color profile to the image header (to EXIF)

We could also implement the same solution with better precision. Before the resize we could apply reverse gamma to all color components of each pixel, in order to perform the resize in linear space, then re-apply the gamma to all pixels right after sharpening. The visual difference is not big, though it is noticeable, and the computational cost of this modification is low, so it can easily be done. We just need to add reverse and forward gamma to the GPU image-processing pipeline.
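A CPU-side sketch of that linear-light idea in Python/NumPy, with two stated assumptions: a plain power-law gamma of 2.2 (real sRGB uses a piecewise curve) and a simple 2×2 box average standing in for the Lanczos filter:

```python
import numpy as np

GAMMA = 2.2  # assumed power-law display gamma; real sRGB is piecewise

def resize_half_linear(img: np.ndarray) -> np.ndarray:
    """Downscale an 8-bit (H, W, C) image by 50% with a 2x2 box average
    performed in linear light: reverse gamma -> average -> forward gamma."""
    linear = (img.astype(np.float64) / 255.0) ** GAMMA      # reverse gamma
    h, w = img.shape[:2]
    blocks = linear[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2, -1)
    small = blocks.mean(axis=(1, 3))                        # 2x2 box average
    out = (small ** (1.0 / GAMMA)) * 255.0                  # forward gamma
    return np.clip(out.round(), 0, 255).astype(np.uint8)
```

Averaging one black pixel with three white ones gives 224 in linear light versus 191 in gamma space, which is exactly the kind of subtle brightness difference the paragraph describes.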

There is one more interesting approach to the same JPEG resize task. We can do JPEG decoding on a multicore CPU with the libjpeg-turbo software. Each image can be decoded in a separate CPU thread, while all the rest of the image processing is done on GPU. With a sufficient number of CPU cores we can achieve high decoding throughput on the CPU, though latency degrades significantly. If latency is not a priority, this approach can also be very fast, especially when the original image resolution is small.

General requirements for a fast JPEG resize

The main idea is to avoid storing dozens of copies of the same image with different resolutions. We can create a necessary image with the required resolution immediately right after receiving an external request. This is the way to reduce storage size, because we need to have just one original image instead of a series of copies.

We have to accomplish JPEG resize tasks very quickly; responding fast to client requests is a matter of service quality.

The image quality of the resized version should be high.

To ensure precise color reproduction, we need to preserve the color profile from the EXIF data of the original image.

The image file size should be as small as possible, and the image resolution should coincide with the window size on the client's device:

  a) If the image size is not the same as the window size, the client's device (smartphone, tablet, laptop, PC) will apply a hardware-based resize right after decoding the image. In OpenGL such a resize is always bilinear, which can create artifacts or moiré on images with high-frequency detail.

  b) On-screen resizing consumes extra energy on the device.

  c) If we keep multiple image copies at different resolutions, in most cases we will not be able to exactly match the image resolution to the window size, so we will send more traffic.

2. Full pipeline for web resize, step by step

  1. We collect images from users in any format and resolution.

  2. In offline mode, using ImageMagick (which supports various image formats), we transform the original images to standard 24-bit BMP/PPM, apply a high-quality resize with a downscale to 1K or 2K, then do JPEG encoding, which should include restart-marker embedding. The last step can be done either with the jpegtran utility on CPU or with the Fastvideo JPEG encoder on GPU; both can work with JPEG restart markers.

  3. Finally, we create a database of such 1K or 2K images to work with further.

  4. After receiving a user’s request, we get the full info about the required image and its resolution.

  5. Find the required image in the database, copy it to system memory, and notify the resizing software that a new image is ready for processing.

  6. On GPU we do the following: decoding, resizing, sharpening, encoding. After that the software copies the compressed image to system memory and adds a color profile to EXIF. Now the image is ready to be sent to the user.

  7. We can run several threads or processes for a JPEG resize application on each GPU to ensure performance scaling. This is possible because GPU occupancy is not high while working with 1K and 2K images. Usually 2-4 threads/processes are sufficient to get maximum performance at a single GPU.

  8. The whole system should be built on professional GPUs like the NVIDIA Tesla P40 or V100. This is vitally important, as the NVIDIA GeForce series is not intended for 24/7 operation at maximum performance for years, and NVIDIA Quadro GPUs carry multiple monitor outputs that are unnecessary for fast JPEG resizing. GPU memory requirements are very low, so we don't need GPUs with a large amount of memory.

  9. As an additional optimization, we can also create a cache of the most frequently processed images for faster access.
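The offline preparation in step 2 can be sketched as command construction for the two tools named above, ImageMagick's convert and jpegtran. The paths, the 2K cap, and the restart interval of 8 MCU rows below are illustrative choices, not Fastvideo's settings:

```python
from pathlib import Path

def prep_commands(src: Path, out_dir: Path, max_dim: int = 2048,
                  quality: int = 90, restart_rows: int = 8) -> list[list[str]]:
    """Build the two offline commands that prepare one source image:
    1) ImageMagick `convert`: decode any input format and apply a
       high-quality downscale so no side exceeds max_dim ('>' = shrink only);
    2) `jpegtran -restart N`: losslessly insert a restart marker every N
       MCU rows so the GPU decoder can parallelize entropy decoding."""
    tmp = out_dir / (src.stem + ".tmp.jpg")
    final = out_dir / (src.stem + ".jpg")
    magick = ["convert", str(src),
              "-resize", f"{max_dim}x{max_dim}>",
              "-quality", str(quality),
              "-sampling-factor", "4:2:0",
              str(tmp)]
    jpegtran = ["jpegtran", "-restart", str(restart_rows),
                "-outfile", str(final), str(tmp)]
    return [magick, jpegtran]
```

A batch driver would run each command pair with subprocess.run and delete the temporary file afterwards; building the argument lists separately keeps the prep step easy to audit and test.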

3. Software parameters for JPEG resize

The width and height of the resized image can be arbitrary and are defined with one-pixel precision. It's a good idea to preserve the original aspect ratio of the image, though the software can also work with any width and height.

We utilize JPEG subsampling modes 4:2:0 and 4:4:4.

We can get maximum image quality with 4:4:4, while minimum file size corresponds to 4:2:0 mode. Subsampling works because the human visual system resolves the luma component of an image better than chroma.

JPEG image quality and subsampling are set once for all images in the database.

We do sharpening with a 3×3 window and we can control sigma (radius).

We also need to specify the JPEG quality and subsampling mode for output images; these parameters need not match those of the input image. A JPEG quality of 90% is usually considered visually lossless, meaning the user can't see compression artifacts under standard viewing conditions. In general one can push JPEG quality up to 93-95%, but then file sizes grow for both input and output images.

4. Important limitations for a web resizer

We can get very fast JPEG decoding on GPU only when all our images contain built-in restart markers. Without restart markers, JPEG decoding cannot be parallelized, and we would not get high performance at the decoding stage. That's why we need to prepare the database with images that have a sufficient number of restart markers.

At the moment we believe JPEG is the best compression choice for this task, because a JPEG codec on GPU is much faster than any competing format/codec for image compression and decompression: WebP, PNG, TIFF, JPEG2000, etc. This is not just a matter of format choice; it is a matter of the high-performance codecs available for these formats.

Standard image resolution for a prepared database could be 1K, 2K, 4K or anything else. Our solution will work with any image size, but total performance could be different.

5. Performance measurements for resizing 1K and 2K JPEG images

We tested with an NVIDIA Tesla V100 (OS Windows Server 2016, 64-bit) on 24-bit images 1k_wild.ppm and 2k_wild.ppm with 1K and 2K resolutions (1280×720 and 1920×1080). Tests were done with different numbers of threads running on the same GPU. To process 2K images we need around 110 MB of GPU memory per thread; for four threads, up to 440 MB.

First we encoded the test images to JPEG with 90% quality and 4:2:0 or 4:4:4 subsampling. Then we ran the test application, doing decoding, resizing, sharpening, and encoding with the same quality and subsampling. Input JPEG images resided in system memory, and the processed image was copied from GPU back to system memory as well. We measured the timing of that procedure.

Command line example to process 1K image:

PhotoHostingSample.exe -i 1k_wild.90.444.jpg -o 1k_wild.640.jpg -outputWidth 640 -q 90 -s 444 -sharp_after 0.95 -repeat 200

JPEG subsampling 4:2:0 for the input image leads to slower performance, but input and output image sizes are smaller in that case. For 4:4:4 subsampling we get better performance, though image sizes are bigger. Total performance is mostly limited by the JPEG decoder module, and this is the key algorithm to improve for faster solutions in the future.

6. Summary

The above tests show that on just one NVIDIA Tesla V100 GPU, resize performance can reach 1000 fps for 1K images and 900 fps for 2K images at the specified test parameters. To get maximum speed, we need to run 2-4 threads on the same GPU.

A latency of around one millisecond is a very good result. To the best of our knowledge, one can't get such latency on a CPU for this task, which is one more argument for GPU-based JPEG resizing in high-performance professional solutions.

To process one billion 1K or 2K JPEG images per day, we need up to 16 NVIDIA Tesla V100 GPUs for the JPEG resize on-the-fly task. Some of our customers have already deployed this solution at their facilities; others are currently testing the software.
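The GPU count follows from simple throughput arithmetic. The 75% utilization figure below is our own assumption, chosen to leave headroom for traffic spikes; it is not a number from the tests:

```python
import math

IMAGES_PER_DAY = 1_000_000_000
SECONDS_PER_DAY = 24 * 60 * 60          # 86,400

def gpus_needed(fps_per_gpu: float, utilization: float = 0.75) -> int:
    """GPUs required to sustain the daily load, running each GPU at
    `utilization` of its peak fps to leave headroom for traffic spikes."""
    required_fps = IMAGES_PER_DAY / SECONDS_PER_DAY   # ~11,574 images/s
    return math.ceil(required_fps / (fps_per_gpu * utilization))
```

At the measured 1000 fps per V100 this lands on 16 GPUs with 25% headroom, matching the figure above; at full peak utilization, a dozen would suffice on paper.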

Please note that a GPU-based resize can be very useful not only for high-load web services, but also for many other high-performance imaging applications where a fast resize really matters. For example, it can be used at the final stage of almost any image processing pipeline before output to a monitor. The software works with any NVIDIA GPU: mobile, laptop, desktop, server.

7. Benefits of a GPU-based JPEG resizer

  • Reduced storage size

  • Lower infrastructure costs for initial hardware and software purchases

  • Better quality of service due to low latency response

  • High image quality for resized images

  • Minimal traffic

  • Less power consumption on client devices

  • Fast time-to-market software development on Linux and Windows

  • Outstanding reliability and speed of heavily-tested resize software

  • No need to store multiple image resolutions, so no additional load on the file system

  • Fully scalable solution which is applicable both to a big project and to a single device

  • Better ROI due to GPU usage and faster workflow

8. To whom it may concern

A fast resize of JPEG images is definitely an issue for high-load web services, big online stores, social networks, online photo management and sharing applications, e-commerce services, and enterprise-level software. Fast resizes deliver better results in less time and at lower cost.

Software developers can benefit from a GPU-based library that resizes JPEG images with latency in the range of a few milliseconds.

This solution could also rival the NVIDIA DALI project for fast JPEG loading at the training stage of machine learning and deep learning frameworks. We can offer very high performance for JPEG decoding together with resizing and other image augmentation features on GPU, making the solution useful for fast data loading in CNN training. Please contact Fastvideo if you are interested.

9. Roadmap for the JPEG resizing algorithm

Apart from JPEG codecs, resizing and sharpening, we can also add:

  1. Cropping, color correction, gamma, brightness, contrast, and rotation by 90/180/270 degrees (these modules are all ready)

  2. Advanced file format support (JP2, TIFF, CR2, DNG, etc.)

  3. Parameter optimizations for NVIDIA Tesla P40 or V100

  4. Further JPEG Decoder performance optimization

  5. Implementation of batch mode for image decoding on GPU

Useful links

  1. Full list of features from Fastvideo Image Processing SDK: https://www.fastcompression.com/products/sdk/sdk.htm

  2. Benchmarks for image processing algorithms from Fastvideo SDK: https://www.fastcompression.com/pub/2018/Fastvideo_SDK_benchmarks.pdf

P.S. The latest benchmarks for JPEG resize on one NVIDIA Tesla V100 GPU reach 1400 images per second for 1-Mpix images.

submitted by /u/veeableful
Categories: Web Development

Why monday.com is the Universal Team Management Tool for Your Team

CSS-Tricks - Thu, 11/15/2018 - 09:52

This platform is perfect for teams sized at 2-to-200 — and gives every employee the same level of transparency.

Every project management tool seeks to do the same instrumental thing: keep teams connected, on task and on deadline to get major initiatives done. But the market is getting pretty crowded, and for good reason — no platform seems to have gotten the right feel for what people need to see, and how that information should be displayed so that it’s both actionable/relevant and contextualized.

That’s why monday.com is worth a shot. The platform is based on a simple but powerful idea: that as humans, we like to feel we’re contributing to part of a greater good — an idea that sometimes gets lost in the shuffle as we focus on the details of getting stuff done. So projects are put onto a task board (think of it like a digital whiteboard), where everyone has the same level of visibility into everyone else who’s contributing a set of tasks. That transparency breaks down the silos between teams that cause communication errors and costly project mistakes — and it’s a beautiful, simple way to connect people to the processes that drive forward big business initiatives.

Whether you’re part of a tech-forward team or not, monday.com is a welcome relief to cumbersome Excel files, messy (physical) whiteboards, or meetings that waste time when actual work could be completed. The scalable, intuitive structure can effectively work for a team of two, or an international team of 2,000+ — and a beautiful, color-coded board lays out tasks you can cleanly see and tag for various stages of completion. That way, employees can see exactly what needs to be done (and who needs to do it), while managers can optimize their time re-allocating resources as necessary to optimize processes. It’s a win-win.

monday.com also allows teams to communicate within the platform, cutting down on the laborious sifting through email threads to figure out a workflow. Messages can be sent inside tasks, so all the communication is contextualized. The platform also supports uploads, so documents and videos can be added to facilitate more collaboration, as well as integration with other productivity apps. If your team is already using tools like Slack, Google Calendar, Dropbox, Microsoft Excel, Trello, and Jira, there are specific, clean shortcuts to integrate the information from those platforms into monday.com. And even beyond team communication and management, you can use monday.com for client-facing exchanges, so all your messages are consolidated in a single place.

The platform recently raised $50M in funding, and received nods from the likes of Forbes, Entrepreneur, Business Insider, and more for its ability to empower international teams to do better work together. Best of all, unlike other team management software, which can be pricey and time-intensive to scope, test and run, you can try monday.com today — for free.

What can this app do?
  • Creating and managing a project’s milestones
  • Creating and assigning tasks
  • Attaching files to any project’s table
  • Using mobile applications to manage projects
  • Communicating with your team members
  • Updating team using the news feed
  • Keeping clients in the loop
  • Organizing the organization into teams
  • Creating detailed project charts and reports
  • Tracking the time your team members spend on tasks
  • Managing a project's financials
  • Website as well as a desktop app for Mac and Windows

monday.com aims to make every user feel empowered and part of something bigger than their own individual tasks, and as a result, to boost collective productivity and transparency.

The post Why monday.com is the Universal Team Management Tool for Your Team appeared first on CSS-Tricks.

Categories: Web Development

Project management for a self-taught webdev's first REAL assignment

webdev: reddit for web developers - Thu, 11/15/2018 - 09:49

I work as a mechanical engineer for a company, and half my time there has been spent preparing and giving courses in this field. Over the last 3 years I've learned webdev in my spare time, from simple HTML and JavaScript up to the MERN stack now. I've made small educational/scientific one-page web apps to use in my courses to aid understanding. They were easy to make, as the scope stayed small.

Now I've gotten involved in a start-up, where the idea is to develop a whole platform with the complexity of something like the Kickstarter website: authentication, adding entries, payment systems, but also document upload and a bidding system. And this is not even half of the project. Over time I'll be more and more involved, and I'll have to work with other web devs as well.

I could build all of these individual features, but the task of making everything fit together is daunting for me.

My question is: are there any books, online courses, or other resources about the project-management side of creating big web applications, to make all the pieces fit together while still keeping an overview? Also, I want to delegate some features to other web devs; how do I best communicate this?

I remember learning UML in my university years, but applied to Java and C++/C#. Is there any standard like this for web apps?

submitted by /u/eggman0
Categories: Web Development

Am I the only one that hates hero sections?

webdev: reddit for web developers - Thu, 11/15/2018 - 09:22

You know, the ones with a huge image over all your screen, a slogan that tells nothing and maybe a link to one more such page.

I see them purely as a waste of space. I don't care about your slogan or that stock image; I came to this site because I need some information. Why is my screen empty of any actual content?

Why do designers keep drawing them and including them in designs? I feel like the site expects idiot visitors when I have to implement these sections.

submitted by /u/Tontonsb
Categories: Web Development

Drupal Dev Thursdays: Post here with development questions or discussion

Drupal - Open Source Content Platform - Thu, 11/15/2018 - 09:06

This is the weekly thread for development questions or chit-chat that doesn't belong in the Monday Beginner Questions thread. All questions/comments/ranting about Drupal dev is fair game.

(Check out the weekly post schedule in the sidebar)

submitted by /u/AutoModerator